Security Staff Acquisition & Development

Aptitude test pinpoints brains best suited for cyber jobs

Cybersecurity operations within the U.S. Air Force, February 2021. The Air Force has been using a cyber aptitude test from Haystack Solutions to assess which airmen are best suited for various infosec roles. (U.S. Air Force Photo by Maj. Christopher Vasquez, Public domain, via Wikimedia Commons)

If I were ever inclined to parlay the knowledge I’ve accrued at SC Media into an actual cybersecurity career, I now know which infosec jobs are best suited to my unique patterns of thinking. And, for that matter, I'd know which jobs I might want to stay far away from.

That’s because I recently subjected myself to a comprehensive cyber aptitude test that is available for commercial adoption after previously helping military institutions identify promising new security talent and assign them to specific roles.

Originally developed by a team of cognitive psychologists, psychometricians and cyber experts at the University of Maryland (UMD), this Cyber Aptitude and Talent Assessment (CATA) was launched by UMD startup Haystack Solutions to measure test-takers in five “cerebral dimensions”: critical thinking, exhaustiveness of approach and practices, initiating behaviors, real-time effectiveness, and responding behaviors. Data related to these dimensions can be mapped along an x and y axis into four quadrants, representing different areas of cyber expertise: offensive operations, defensive operations, analytics/forensics, and design/development.

The test is designed to remove both gender and socioeconomic bias. Rather than measure one’s familiarity with cyber concepts or experience in the field, CATA focuses primarily on cognitive ability and motivation to learn.

“What CATA helps thin out is whose synapses are really firing in the right ways, versus somebody who might have just gotten a certification,” Haystack Solutions CEO Doug Britton told SC Media. “Are you thinking like a cybersecurity practitioner?”

This past July, Haystack publicly shared the full results of an 18-month UMD study, conducted back in 2015, which confirmed that CATA testing of Air Force airmen was highly effective at finding the military branch’s top cyber performers. Indeed, the test successfully identified 97% of Air Force recruits who would go on to achieve “elite” scores on a USAF IT fundamentals course (i.e. a 90% course average or better). CATA also managed to distinguish between high-skill and untrained USAF cyber personnel with 84% accuracy, according to a press release on the study.

“The Air Force came to us with a problem and said, ‘We train people to become cyberwarfare operators – and we want to pick people for this career field that will be successful in training, and on the job,’” said Haystack Chief Technology Officer Michael Bunting, co-inventor of the CATA technology, and also director of cognitive security and information operations at University of Maryland’s ARLIS center. “The ideal person is one who gets through the training in one pass, and doesn't have to retrain,” he explained, in an interview.

The U.S. Special Operations Command (SOCOM), U.S. Navy, and West Point have also implemented the aptitude test “to create some of the highest performing cyber teams,” Bunting further stated in the press release.

One of the most eye-opening non-government case studies involved the University of North Georgia (UNG). Since 2019, the school has administered the test to current and incoming students to determine which pupils are best suited to join the CyberHawks, the university’s cyber operations team that competes in various competitions, including the NSA Codebreaker Challenge.

“They were able to amass an incredible team that went from 30 people in 2018 to over 250 people in 2020,” said Britton, adding that the proportion of women on the team grew from about 10% to approximately 20%. “Many had never studied cyber or software development. And a lot of these women are now nationally competitive in hacking competitions,” said Britton.

Upon studying the test data of people competing in capture-the-flag (CTF) competitions, “we showed that [CATA] was 60, 70-plus percent effective at predicting who is going to do best,” Britton added.

Doug Britton, CEO, Haystack Solutions.

As for the UNG team's performance in the Codebreaker Challenge, “they dominated the competition so much… that if teams two through 10 had competed as one team, they still would have lost to the University of North Georgia,” said Britton.

Most of these cyber team members – who ranged from computer science majors to psychology or criminal justice students – have continued to pursue cybersecurity, because “when you're naturally aligned with things that interest you and that you like, you want to keep doing it,” Britton continued.

“It has been heartening to see [CATA] adapted for the commercial sector and, in early trials, to help identify previously unexplored but inherently genius-level cyber talent in schools and universities, who are now garnering some of the most prestigious CTF awards, and who had not previously considered cybersecurity careers,” noted Bunting in the press release.

Indeed, CATA could be yet another invaluable tool to help businesses and educational institutions source and develop untapped talent to help fill the cyber skills gap.

“For companies that are trying to hire, it helps them hire right the first time. They're able to sort through candidates and make sure… they've got the right mindset for the job,” said Britton. And the test can even help companies save money on upskilling by pinpointing the employees who are most likely to excel at training.

“The washout rates for upskilling programs are pretty bad in many cases because you don't have a good sense of who's going to be able to complete the training in advance,” Britton noted. “The problem is, you have to spend the money before you know – [but] this gives you the ability to separate out where that money should go.”

My very own test results

The test itself is divided into 14 separate segments, and I tried all of them. Each segment is intended to measure a particular cognitive ability that correlates well to certain cyber jobs. Currently, test-takers are measured against a baseline of government and military cyber personnel who have also taken the test. Even scores that land in the 30th to 50th percentiles are still considered strong indicators of aptitude because they are being compared to individuals with federal cyber experience. And those who score even higher at certain tasks are likely to fare especially well at jobs using those corresponding skills.

“If you hit the midpoint here – if you hit 50% – that's actually really exciting,” said Britton. “50% is a big deal. You're swimming with some very competitive practitioners.”

So with that perspective in mind, just how well did I do? Let’s take a closer look.

Critical Thinking: This was my most up-and-down category. I performed exceptionally well on a “Remember and Count” exercise designed to measure my visuo-spatial working memory. This task reminded me a bit of the old children’s electronic game “Simon.” I watched as the north, east, south and west quadrants of a diamond shape would light up in various colors, and then I had to remember which colors blinked where and in what order. For that one, I scored in the 99th percentile.

I also performed admirably at a complex problem-solving exercise. However, I did not fare nearly as well at inductive rule learning and spatial visualization. For the latter exercise, I was asked to imagine folding differently shaped pieces of paper in various ways, and then poking a hole through the multiple folds of paper with a pencil. Next, I had to imagine unfolding the paper, and figure out where all the holes would be. That broke my brain, and I scored in the 28th percentile.

“You work well with problems where the entire task is right in front of you,” said the test, which is a nice way of saying that I have a tough time visualizing multiple moves ahead.

According to Haystack, such skills are especially important for cyber pros specializing in network security who need to visualize complex network topologies. “What I've seen people do in troubleshooting that are good at that is just mind-blowing,” said Britton. “What they do in seconds, you could give other people 10,000 years and they wouldn't close the gap.”

So it appears I may not have a bright future in the network security space… or in origami or paper airplane folding, for that matter.

Aside from these four skills tests, participants are also given a survey that aims to measure to what degree they enjoy and crave cognitively strenuous activities on a day-to-day basis.

Exhaustive: This particular category doesn’t so much measure skills as it assesses one’s personality and philosophic approaches related to problem-solving and taking risks. CATA's assessment: I prefer to take my time and explore all options rather than committing to an answer too hastily. As a result, I’m better suited to more exhaustive search problems and open-ended monitoring.

Additionally, it was determined that I have a low risk tolerance, meaning I may be uncomfortable making snap decisions that could result in a damaging outcome. In fact, 87% of test-takers were willing to take more chances than I was. However, this is not necessarily a negative outcome, Bunting explained.

“I don't know that it's bad to be risk averse for some cybersecurity jobs or tasks,” said Bunting. “Because if you're risk averse, you might be more likely to find the adversary or be conservative in the assumptions that you make, or what you allow to get through a firewall, or something like that. So it's not necessarily a bad thing. It’s just a different way of reacting.”

Initiating: According to CATA, this section examines “the ability to build models and make connections to generate novel solutions.” I thoroughly conquered this section of the test, which features mini-games that measure creative thinking and the ability to construct mental models.

To gauge your creative thinking skills, CATA’s “Remote Associates” game presents you with a trio of words and gives you just 20 seconds to think up a fourth word that can be logically paired with the other three. For instance, the words “call,” “pay” and “line” can all be matched with “phone.”

“Your ability to see and test potential connections between facts will help you quickly identify potential relationships between indicators of attack or faults in system behavior,” the test results page informed me, after I scored in the 99th percentile.

A practice screen from the "Spatial Integration" portion of Haystack Solutions' Cyber Aptitude and Talent Assessment.

The “Spatial Integration” or mental modeling portion of the test asks participants to look at a set of four objects, shown only two at a time to reveal one object’s positioning in relation to another (i.e. horizontally adjacent or vertically adjacent). Based on these pairings, you are asked to envision the holistic picture of how all four objects are positioned relative to each other.

I had a technique for this one, manipulating my fingers in real physical space to mimic how the objects looked on screen. It worked: I scored in the 97th percentile.

“You're able to fit the proverbial 'pieces of the puzzle' together, rapidly testing, rejecting, retesting hypotheses,” the test told me. “In responding to cyberattacks, this will help you rapidly survey your network and understand the methods the attacker has deployed against you.”

“It makes a lot of sense,” said Bunting. “As a journalist, you’re probably very practiced in creating stories and figuring out how pieces fit together… That's the underlying ability.”
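The kind of reasoning the Spatial Integration task exercises – recovering a full arrangement from pairwise clues – can be illustrated with a toy brute-force solver. This is a hypothetical sketch for illustration only; the objects, clue format, and logic are invented and are not part of CATA itself.

```python
from itertools import permutations

def check(pos, a, rel, b):
    """Does one pairwise adjacency clue hold for a candidate placement?"""
    (ra, ca), (rb, cb) = pos[a], pos[b]
    if rel == "left_of":   # a directly left of b, same row
        return ra == rb and ca + 1 == cb
    if rel == "above":     # a directly above b, same column
        return ca == cb and ra + 1 == rb
    return False

def solve(objects, clues):
    """Try every 2x2 placement of the four objects; keep those
    consistent with all pairwise clues."""
    cells = [(0, 0), (0, 1), (1, 0), (1, 1)]  # (row, col)
    solutions = []
    for perm in permutations(objects):
        pos = dict(zip(perm, cells))
        if all(check(pos, a, rel, b) for a, rel, b in clues):
            solutions.append(pos)
    return solutions

# Three invented clues, shown "two objects at a time" like the test:
clues = [("star", "left_of", "moon"),
         ("star", "above", "cube"),
         ("cube", "left_of", "ball")]
result = solve(["star", "moon", "cube", "ball"], clues)
# These clues pin down a unique 2x2 arrangement.
```

With these particular clues only one arrangement survives, which mirrors the test’s goal of integrating partial views into a single mental model.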

Real-Time: There’s just one task for this category – a psychomotor speed test that asks you to look at a shape and then very quickly determine if the next shape that flashes is a match or mismatch.

I nailed this skill, scoring in the 92nd percentile. “Your quick reactions and problem-solving are resilient to distractions. This would allow you to, for example, execute a penetration test, without being impeded by irrelevant system feedback or warnings,” the test explained.

Responding: No embarrassing results here, though I also didn’t knock any of these four tests out of the park.

I hit the 51st percentile on a pattern recognition/anomaly detection task that flashed pairs of shape sequences in front of me and asked me to select one pair or the other, without explicitly giving instructions on the rules to the game.

On coding speed, I scored in the 62nd percentile. For this test, I was given a number-to-symbol cipher, below which were a series of symbols lined up in a row. As fast as possible, I had to type in the numbers corresponding to those symbols.
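The mechanic behind the coding-speed task amounts to a fast, repeated lookup against an inverted cipher table. A minimal sketch, with an invented cipher and symbol row (nothing here comes from the actual test):

```python
# Hypothetical number-to-symbol cipher, as given at the top of the screen.
cipher = {1: "#", 2: "@", 3: "&", 4: "%", 5: "*"}

# Invert it so each symbol maps back to its number.
decode = {symbol: number for number, symbol in cipher.items()}

# A row of symbols to translate back into digits, as fast as possible.
row = ["@", "*", "#", "&", "%"]
answer = [decode[s] for s in row]
# answer == [2, 5, 1, 3, 4]
```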

I finished in the 73rd percentile on a pattern vigilance test that was effectively a combination vision and concentration test. The screen flashed tiny O’s and tiny D’s in front of a busy background and I had to press a button whenever an O appeared. Not as easy as it sounds.

And I reached the 82nd percentile in anomaly detection – a game that instructed me to spot rules-based coding mistakes, represented by a grid of white and black dots.

The final verdict?

“This is a very well-rounded mental profile, so you can jump between mental extremes,” said Britton, noting that the test indicates I could probably pursue any number of cyber career paths.

But perhaps the job I’m best suited for lies in offensive red-team security, according to the experts.

Gazing into the Haystack crystal ball, my hypothetical second career in cybersecurity might be as a pentester or ethical hacker – “the person who goes in and just kicks down the door… versus the one who might be a little bit more thoughtful, building the castle, that’s a bit more defensive,” said Benjamin Laimon, chief operating officer at Haystack. With that said, there were some strong indicators of defensive skills as well, he added.

Alternatively, I also appear to be “forensics-minded,” enabling me to “go in and put the pieces back together and figure out what went wrong and how to take action” following a security incident, Laimon continued.

“It looks like you have your choice” of cyber career, he added.

Good to know – although for now, I think I’ll stick with what I know best – cyber journalism. But maybe that’s just my "low risk tolerance" talking.

Bradley Barth

As director of community content at CyberRisk Alliance, Bradley Barth develops content for SC Media online conferences and events, as well as video/multimedia projects. For nearly six years, he wrote and reported for SC Media as deputy editor and, before that, senior reporter. He was previously a program executive with the tech-focused PR firm Voxus. Past journalistic experience includes stints as business editor at Executive Technology, a staff writer at New York Sportscene and a freelance journalist covering travel and entertainment. In his spare time, Bradley also writes screenplays.
