
Threat automation, decentralized architecture among emerging post-COVID cyber trends

A recent report points to a cadre of new or emerging technologies that are filling a specific post-COVID need for businesses. Seen here is the Kaspersky Lab booth at Hannover Messe, where the company presented its cybersecurity solutions for Industry 4.0, focusing on machine learning and IoT, among other things. (Photo by Joern Pollex/Getty Images)

By now, it’s news to approximately no one that the coronavirus pandemic has dramatically and perhaps permanently altered the way companies do business.

As COVID-19 has upended our way of life and sent workers home, it has also prompted a widespread reevaluation of emerging IT and security trends. New research this week sheds light on how the pandemic is shaping innovation priorities and technology adoption in the cybersecurity space.

Technology research firm Gartner lists a number of new or emerging approaches to information security in its report on top strategic technology trends for 2021. One development expected to take off in the coming years is growing adoption of “cybersecurity mesh,” the firm’s term for organizations rearchitecting their networks, systems and access policies to fit a newly mobile, distributed workforce.

Similar to mesh networking, the idea is to flatten the hierarchy of IT networks, assets and connections away from a centralized HQ boundary and toward a more decentralized architecture. Gartner believes more organizations will look to adopt this approach to enable “any person or thing to securely access and use any digital asset, no matter where either is located, while providing the necessary level of security.”
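To illustrate the idea, here is a minimal, hypothetical sketch of what an identity-centric access decision might look like in such a decentralized model: access hinges on who is asking, the posture of their device and the sensitivity of the asset, not on whether the request comes from inside a corporate perimeter. The class and function names are invented for illustration and are not drawn from Gartner’s report.

```python
# Minimal sketch of an identity-centric access check in a "mesh" model:
# decisions hinge on who or what is requesting and the asset's sensitivity,
# not on whether the request originates inside a corporate perimeter.
# All class and function names here are illustrative, not a Gartner spec.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_id: str
    user_roles: set          # roles asserted by the identity provider
    device_compliant: bool   # e.g., disk encryption and patching verified
    mfa_passed: bool
    asset_sensitivity: str   # "public", "internal", or "restricted"

def is_access_allowed(req: AccessRequest, required_role: str) -> bool:
    """Grant access based on identity and device posture, never on network location."""
    if required_role not in req.user_roles:
        return False
    if not req.mfa_passed:
        return False
    # Higher-sensitivity assets additionally require a compliant device.
    if req.asset_sensitivity == "restricted" and not req.device_compliant:
        return False
    return True

# A remote worker on a home network gets the same evaluation as an on-site one.
request = AccessRequest("djohnson", {"finance"}, device_compliant=True,
                        mfa_passed=True, asset_sensitivity="restricted")
print(is_access_allowed(request, required_role="finance"))  # True
```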

It’s part of a cadre of new or emerging technologies that are filling a specific post-COVID need for businesses: what Gartner calls “location independence,” or the need for IT and security functions to support different people and parts of the business process regardless of where they are in the world.

Other emerging trends flagged by Gartner revolve around three computation techniques designed to enhance the security or privacy of an organization’s data: confidential computing; decentralized machine learning; and cryptographic methods such as homomorphic encryption, secure multiparty computation and zero-knowledge proofs. All of these tools are designed to “safely share data in untrusted environments,” something that has become more urgent this year as workers log in to work systems from their home networks and share sensitive data outside the office.
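To make one of those techniques concrete, below is a toy sketch of additive secret sharing, a building block behind many secure multiparty computation schemes: two organizations combine figures without either revealing its own input. The example is illustrative only and is not taken from the Gartner report.

```python
# Toy additive secret sharing over a prime field: each party holds a
# random-looking share that reveals nothing alone, yet the parties can
# jointly compute a sum without anyone seeing the other inputs.
import secrets

PRIME = 2**61 - 1  # a large prime modulus

def share(value: int, n_parties: int) -> list:
    """Split `value` into n shares that sum to `value` mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares: list) -> int:
    return sum(shares) % PRIME

# Two organizations share their incident counts without revealing them.
a_shares = share(1200, 3)
b_shares = share(3400, 3)

# Each party adds its shares locally; only the combined total is reconstructed.
sum_shares = [(a + b) % PRIME for a, b in zip(a_shares, b_shares)]
print(reconstruct(sum_shares))  # 4600, with neither input disclosed
```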

Another report released this week, a survey from MicroFocus of 410 IT security executives at large companies in the U.S., Germany, Japan, India and the United Kingdom, found some surprisingly strong adoption numbers for emerging and more established security tools and processes. For example, machine learning and artificial intelligence still face questions around maturity and correct application, but that doesn’t appear to be stopping most organizations from dipping their toes in. More than 93% of organizations say they use either ML or AI in parts of their security operations products, and the number one reason for doing so is improving threat detection.
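In practice, applying machine learning to threat detection often starts with statistical anomaly detection over security telemetry. The snippet below is a deliberately simplified sketch that flags hosts whose failed-login counts deviate sharply from the rest of the fleet; the data, field names and threshold are invented for illustration and do not come from the MicroFocus survey.

```python
# Simplified sketch of anomaly-based threat detection: flag hosts whose
# failed-login counts deviate sharply from a leave-one-out fleet baseline.
# The data and the 3-standard-deviation threshold are illustrative only.
from statistics import mean, stdev

failed_logins = {
    "host-a": 4, "host-b": 6, "host-c": 5, "host-d": 7,
    "host-e": 5, "host-f": 92,  # an outlier worth investigating
}

for host, count in failed_logins.items():
    # Baseline is computed over all other hosts so the outlier
    # does not inflate its own comparison statistics.
    others = [v for h, v in failed_logins.items() if h != host]
    baseline, spread = mean(others), stdev(others)
    z_score = (count - baseline) / spread
    if z_score > 3:
        print(f"ALERT: {host} had {count} failed logins (z={z_score:.1f})")
```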

At least 11 other tools are expected to tip over into common use by 2021, according to the MicroFocus survey, many of which are tied to the desire for better threat detection. They include security configuration management; security information and event management systems; network traffic analysis; threat intelligence platforms or services; patch management; log management; security data lakes; security orchestration, automation and response; threat hunting; and user and entity behavior analytics. All are currently used by at least half of the organizations who responded, while at least 80 percent of organizations expect to be using all of them by next year.

Wanted: more robots and humans

The MicroFocus report found widespread concerns around threat detection, particularly around the volume of threats and the dearth of human talent, an anxiety that “overshadows all other aspects of security operations.” While organizations are leveraging automation, machine learning tools and security information and event management systems, those investments are not enough to keep up with the threat landscape or make up for a lack of human capital. Investigating, validating and prioritizing security incidents was rated the most daunting challenge facing IT security operations teams.

“There’s clearly no shortage of threats, but there’s definitely a shortage of personnel to detect and analyze them,” the report notes.

That could speak to the need for more automation throughout the threat intelligence process to help under-resourced organizations process and analyze the flood of indicators and data flowing into their systems. Gartner also lists “hyperautomation,” the desire to automate as many business and IT processes as possible, as a growing tendency at many companies.

However, in interviews with SC Media, vendors in this space say there are still a number of technical or practical obstacles to automating further parts of the threat intel chain.

For example, good, standardized, clean data is critical for automating higher-level threat intelligence and detection functions, as well as for connecting disparate events into actionable insights for many companies. Some programs, like the Automated Indicator Sharing program set up to share threat indicators and other data between government and the private sector, have floundered: most companies have declined to share their own data back, and those that do participate complain that what data they get from the program is useless or lacking critical context.
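As a rough illustration of why context matters, the sketch below takes shared indicators in a STIX-style shape and sets aside any that arrive without a description or confidence score before they feed detection rules. The specific fields checked and the threshold are assumptions made for this example, not requirements of the Automated Indicator Sharing program.

```python
# Rough illustration: filter shared indicators that arrive without enough
# context (no description, no confidence score) before they feed detections.
# The objects below are STIX-style dictionaries; the fields checked and the
# cutoff are assumptions for this example, not any sharing program's rules.

shared_indicators = [
    {
        "type": "indicator",
        "pattern": "[ipv4-addr:value = '203.0.113.54']",
        "description": "C2 address observed in phishing campaign against retailers",
        "confidence": 80,
    },
    {
        "type": "indicator",
        "pattern": "[ipv4-addr:value = '198.51.100.23']",
        # no description or confidence: a bare indicator with no context
    },
]

def has_usable_context(indicator: dict) -> bool:
    return bool(indicator.get("description")) and indicator.get("confidence", 0) >= 50

usable = [i for i in shared_indicators if has_usable_context(i)]
print(f"{len(usable)} of {len(shared_indicators)} shared indicators carried enough context to act on")
```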

“The biggest problem is being able to apply context to a lot of these signatures, regardless of where you get them,” said Tom Gorup, vice president of security and support operations at Alert Logic, a company that sells a managed detection and response solution. “Wherever your intel sources are coming from, signatures are false a lot, so you need a solid base of understanding of what you consider to be high fidelity [data] in order to create those correlations, the automation of saying if X happens and Y happens, this thing might be occurring.”
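The “if X happens and Y happens” logic Gorup describes can be sketched as a simple correlation rule over an event stream. The event types, fields and 10-minute window below are invented for illustration and do not represent Alert Logic’s implementation.

```python
# Toy correlation rule in the spirit of "if X happens and Y happens, this
# thing might be occurring": flag a host with a suspicious login followed
# shortly by an outbound connection to a flagged address.
# Event names, fields and the 10-minute window are illustrative only.
from datetime import datetime, timedelta

events = [
    {"host": "ws-14", "type": "suspicious_login", "time": datetime(2020, 10, 20, 9, 2)},
    {"host": "ws-14", "type": "outbound_to_flagged_ip", "time": datetime(2020, 10, 20, 9, 8)},
    {"host": "ws-30", "type": "outbound_to_flagged_ip", "time": datetime(2020, 10, 20, 9, 9)},
]

WINDOW = timedelta(minutes=10)

def correlate(events):
    """Pair each suspicious login with a later flagged connection from the same host."""
    logins = [e for e in events if e["type"] == "suspicious_login"]
    callbacks = [e for e in events if e["type"] == "outbound_to_flagged_ip"]
    alerts = []
    for login in logins:
        for cb in callbacks:
            if cb["host"] == login["host"] and timedelta(0) <= cb["time"] - login["time"] <= WINDOW:
                alerts.append((login["host"], login["time"], cb["time"]))
    return alerts

for host, start, end in correlate(events):
    print(f"possible compromise on {host}: login at {start:%H:%M}, callback at {end:%H:%M}")
```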

Derek B. Johnson

Derek is a senior editor and reporter at SC Media, where he has spent the past three years providing award-winning coverage of cybersecurity news across the public and private sectors. Prior to that, he was a senior reporter covering cybersecurity policy at Federal Computer Week. Derek has a bachelor’s degree in print journalism from Hofstra University in New York and a master’s degree in public policy from George Mason University in Virginia.
