
Security robots market set to more than quadruple by 2030

A Naval Surface Warfare Center mechanical engineer controls a Boston Dynamics Spot inspection robot during a repair technology exercise on Aug. 24, 2022, at Port Hueneme, Calif. (Navy)

Polaris Market Research on Tuesday reported that the global security robots market was valued at $27.32 billion in 2021, and the researchers expect it to surpass $116.44 billion by 2030 — a compound annual growth rate (CAGR) of 17.65% over the forecast period.

The researchers explained that security robots incorporate artificial intelligence (AI), streaming video, and other connected technologies to perform security duties that were previously handled by humans. Leading applications of security robots include spying, explosive detection, dynamic mission planning, firefighting, de-mining, rescue operations, transportation, and patrolling.

Industry researchers said that these robots will continue to play an important role in the security business throughout this decade.

“The shortage of trained security personnel makes turning to automation — specifically robots — very attractive,” said Bud Broomhead, chief executive officer at Viakoo. “However, this isn’t a question of security robots replacing humans — it’s a question of how the two can be combined into an effective security strategy. It should be thought of as augmenting the existing workforce, not replacing it.”

Using automation to solve scale problems can be a siren call if left unchecked, said John Bambenek, principal threat hunter at Netenrich. Bambenek noted that automated facial recognition systems have been shown to be disproportionately inaccurate for minority populations.

“Automation for ad tracking or natural language processing has a very low cost of error,” said Bambenek. “When the cost of error is high, like with self-driving vehicles, much more care needs to be taken. When law enforcement (and by extension security) gets things wrong, human rights violations can result. The only way to do this safely is to have a human in the mix, but make no mistake, we don’t have an ‘OWASP’ for data science — and machine learning’s track record in dealing with datasets with intentional manipulation is not great.”
