A lot of people are scared automation will eliminate their jobs. That’s going to be true in some industries, but ours is not one of them. Cybersecurity professionals who are accustomed to dealing with existential threats need to flip their mindsets and think of automation as an existential opportunity.

There are millions of unfilled jobs in security today. The pressure on our people is tremendous, and burnout is a big problem. Cybersecurity professionals are similar to first responders, addressing one crisis after another with no end in sight. The number of alerts is overwhelming, and a sense of futility can creep in.

Almost one in five cybersecurity professionals are not happy in their careers, despite lucrative compensation and the ability to pick and choose their employers. And it’s not just the defenders in the trenches who are mentally exhausted by constant hyper-vigilance; 17 percent of CISOs say they’ve turned to alcohol or medication to manage their stress.

The Maslach Burnout Inventory identifies three critical components of burnout: exhaustion, cynicism, and a lack of self-efficacy. Anyone who’s spent time in a Security Operations Center (SOC) can connect these three factors to the never-ending need to do too much work with too few resources without ever making a dent in the number or frequency of incidents.

Cybersecurity professionals can feel stuck. Stopping the next breach takes priority over getting additional training or bringing innovative ideas to life. It’s no surprise that 65 percent of cybersecurity professionals say they struggle to define their career paths. But while cybersecurity professionals may be dissatisfied with their jobs at times, most are still passionate about their work. What they do is important, and they know it.

People in our industry want to acquire new skills, find more interesting career paths, and do more exciting work. Automation can enable that progression to occur.

The deep chasm between “intelligence” and “thought” 

People are beginning to think of AI as a human brain, but it isn’t. A machine can’t detect the nuances between one group of threat actors and another, or make the connection that shows a sophisticated group of actors incorporating commodity malware in a new style of attack. AI is just math: a collection of algorithms that performs tasks but lacks the intuition and associative memory (the spidey sense) that is a common trait of experienced cybersecurity professionals.

White hat researchers have been experimenting with the security of deep neural networks for some time, and we can be sure bad actors are doing the same. However, just as we worry more about an average worker succumbing to a simple social engineering exploit than we do about devious plots orchestrated by evil geniuses, we should focus on getting good decisions out of our AI instead of blindly moving forward based on what machines tell us to do. A human element is still essential, and that’s not going to change in the next 10 years.

What will change are the roles of cybersecurity professionals. Start by saying goodbye to the Tier 1 analyst.

The opportunity for fun and fulfillment

Automation doesn’t mean fewer jobs. It means better jobs. Many security professionals spend their days doing tasks simple enough to be automated, like finding things in data, seeking loose associations, and looking for things that don’t come out in traditional statistical analyses. For instance, look at the work a T1 analyst performs: a person in this role gathers information and decides whether to escalate. A machine can do that more reliably and at a greater scale than any human.
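To make the T1 workflow concrete, here is a minimal sketch of automated triage: enrich an incoming alert and decide whether to escalate it, close it, or queue it for a human. All field names, the intelligence feed, and the thresholds are hypothetical, chosen only to illustrate the kind of gather-and-escalate decision a machine can take over from a Tier 1 analyst; a real SOC pipeline would be driven by richer data and tuned models.

```python
# Illustrative Tier 1 triage automation. The alert fields, the intel
# feed, the scoring weights, and the thresholds are all assumptions
# made for this sketch, not any vendor's actual product logic.

KNOWN_BAD_IPS = {"203.0.113.7", "198.51.100.23"}  # hypothetical threat-intel feed


def triage(alert: dict) -> str:
    """Return 'escalate', 'auto_close', or 'queue_for_review'."""
    score = 0
    if alert.get("src_ip") in KNOWN_BAD_IPS:
        score += 50  # source matches threat intelligence
    if alert.get("severity", 0) >= 7:
        score += 30  # high vendor-assigned severity
    if alert.get("asset_criticality") == "high":
        score += 20  # target is a crown-jewel asset

    if score >= 70:
        return "escalate"        # hand off to a Tier 2/3 analyst
    if score == 0:
        return "auto_close"      # no signal worth human time
    return "queue_for_review"    # ambiguous; a human decides


alert = {"src_ip": "203.0.113.7", "severity": 8, "asset_criticality": "low"}
print(triage(alert))  # -> escalate (50 + 30 = 80, above the 70 threshold)
```

The point of the sketch is not the specific rules but the shape of the work: enrichment and a repeatable escalate-or-close decision are exactly the tasks a machine performs more reliably and at greater scale than a person, while the ambiguous middle still lands with an analyst.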

When routine activities are automated, T1 analysts can put their skills to better use on important work like incident management, campaign analysis, and incident response. These are experiences that will help a person remain relevant through career transitions.

According to the “Voice of the Analyst” study conducted by Cyentia Institute (page 16, figure 17), SOC analysts enjoy forensics and threat hunting, and these activities provide excellent value. However, minimal time is afforded to them; they are squeezed out by monitoring work. AI enables automation of those low-value, high-time tasks to free up more time for hunting and forensics in the SOC.

From a business standpoint, automation gives organizations room to innovate where talent is in short supply. This is especially significant for organizations that can’t find or afford a full security team. Without enough people, some work doesn’t get done, and some risks go unmitigated. Automation fills staffing gaps and frees up security teams to perform more elevated activities and help their employers address emerging challenges.

Mature companies should already be planning to eliminate or uplevel T1 roles as we know them today. But the driving reason to adopt automation should not be to reduce headcount. It should be to optimize the skills already onboard and to bring in new talent at a higher level. Companies that do this can use AI as a recruiting aid: cybersecurity professionals are decidedly more interested in joining organizations that offer them time to learn on the job and the chance to work on projects that further their careers. Businesses that can’t provide those opportunities will pay more for what they’re getting and have difficulty retaining talent.

The freedom to innovate

Cars today can warn drivers a collision is about to occur, but they can’t prevent a crash from happening. The human driver is still in control. And that’s about where we are in the development of AI: no machine can understand the total risk to the environment, nor can it instruct itself on the best steps to take when faced with the unknown. A human still needs to be at the wheel.

Security professionals are typically innovators. They’re not going to leave AI to run on its own with no oversight or tweaking. They’re going to use it to their advantage, giving it tasks that make sense for a machine to handle while they branch out into new areas of expertise, address more interesting problems, and create more exciting career paths for themselves.

So for at least the next ten years, we won’t see automation replacing analysts on a wholesale basis. AI will be like a car’s collision avoidance system rather than a chauffeur: a helper, not a replacement. AI may aid in solving a particular problem today, but there’s no assurance it will be as useful when attack trends change tomorrow.

Yes, AI can compress detection times and build situational awareness, but don’t make the mistake of believing it has infinite value. An experienced cybersecurity professional is far more valuable than any artificial intelligence system can be, and that will remain true as long as attacks are the products of human minds.

By Jason Lamar, Sr. Director, Product Management, Security Business Group, Cisco