Cybersecurity has become a major spending priority for security leaders worldwide, yet many still lack confidence in their cybersecurity posture. This is largely because businesses perceive cybersecurity as a technology problem that can be solved by throwing more money at it. Contrary to this popular belief, 77% of cyber-attacks are caused by human failures, while only 23% stem from technical glitches.
So what causes these human failures, and why are we so susceptible to manipulation? An understanding of human nature and behavioral science should help organizations pivot their approach to cybersecurity.
The subconscious mind is hardwired to take cognitive shortcuts, known as heuristics, that help us digest information as quickly as possible. The biases these shortcuts produce influence how we think, how we behave, and how we make decisions. For example, people consider sharks far more dangerous than mosquitoes, even though mosquitoes kill more people in a day than sharks do in 100 years.
Since the vast majority of cyberattacks begin with a phishing email, social manipulation is clearly both widespread and effective. Cybercriminals exploit cognitive biases to launch sophisticated, personalized phishing attacks that can manipulate a person into clicking suspicious links. Let’s examine the five cognitive biases most commonly exploited in phishing attacks:
- Halo Effect: The tendency to form a positive impression of a person, company, brand, product, or service. Cybercriminals often impersonate trusted entities such as banks or reputable organizations, leading people to open malicious attachments, click malicious URLs, or visit malicious websites.
- Hyperbolic Discounting: The inclination to choose smaller immediate rewards over larger rewards that come later. For example, most of us tend to fall for “free trials” or “free coupons” and happily give away our credit card information without considering possible long-term negative outcomes.
- Curiosity Effect: Curiosity works like an itch that the victim needs to scratch. People are naturally curious and often indulge in risky behavior to satisfy a craving. Cybercriminals manipulate readers via email and social media by crafting messages (news headlines, advertisements, and other clickbait campaigns) that arouse curiosity.
- Recency Effect: The tendency to give the most weight to recent events, which can result in poor judgment and risky security behavior. For example, most security teams admit to ignoring one-third of all security alerts because the majority of them are false positives.
- Authority Bias: People are unconsciously more influenced by those in a position of authority. Business email compromise (BEC) scams, among the most financially damaging cybercrimes (nearly $2 billion in losses in 2020), use authority bias as a means to defraud users. For example, an employee in the finance department suddenly receives a fraudulent email, ostensibly from the CEO, with instructions to transfer a large sum of money.
How to change behaviors
As cyberattacks grow in volume and sophistication, the need to tackle human biases head-on becomes even more critical. Remote work has become the new norm, and employees working in isolation are more susceptible to scams than ever before. Technical controls like anti-virus and intrusion detection can filter out some malicious activity, but thwarting spearphishing and other forms of social engineering requires training people to recognize scams. They are the last line of defense. Below are three recommendations that can help organizations get started:
- Practice consistent and personalized training and engagement: Organizations can only achieve long-term behavioral change through regular training exercises. To prevent breaches, organizations must consistently test workers with real-world phishing simulations covering new and emerging threats. Employees should receive personalized coaching and guidance based on their aptitude, susceptibility to risk, job role, and department.
- Make cybersecurity part of the core culture: Leadership must recognize cybersecurity as a foundational element of organizational culture rather than a discrete risk-mitigation initiative. Companies can only develop a cybersecurity culture if leadership fosters an environment where positive security attitudes and behaviors are encouraged and celebrated.
- Use a data-driven approach to measure attitudes: Companies may find cybersecurity culture difficult to measure and quantify, but it’s not impossible. Start with a baseline assessment of employee awareness, behaviors, and perceptions, then build a long-term strategy to track and improve those metrics over time.
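As a concrete illustration of such a baseline metric, the sketch below computes a per-department "phish-prone" rate from phishing-simulation results. The field names and campaign data are illustrative assumptions, not the output of any particular security product:

```python
# Minimal sketch: deriving a baseline "phish-prone" rate per department
# from hypothetical phishing-simulation results. Field names ("department",
# "clicked") and the sample data are illustrative assumptions.
from collections import defaultdict

def phish_prone_rates(results):
    """Return the fraction of simulated phishing emails clicked, per department."""
    sent = defaultdict(int)
    clicked = defaultdict(int)
    for r in results:
        sent[r["department"]] += 1
        if r["clicked"]:
            clicked[r["department"]] += 1
    return {dept: clicked[dept] / sent[dept] for dept in sent}

# Hypothetical results from one simulated campaign.
campaign = [
    {"department": "finance", "clicked": True},
    {"department": "finance", "clicked": False},
    {"department": "finance", "clicked": False},
    {"department": "engineering", "clicked": False},
    {"department": "engineering", "clicked": True},
]

print(phish_prone_rates(campaign))
```

Running the same computation after each quarterly campaign turns a vague notion of "security culture" into a trend line that can be tracked and improved over time.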
Businesses tend to treat cybersecurity as a technology problem, completely ignoring the human side of the equation. This needs to change. It’s time we move on from human-as-a-problem to human-as-a-solution.
Perry Carpenter, chief evangelist and strategy officer, KnowBe4