It’s no coincidence that cybersecurity terminology borrows heavily from medicine: bugs, immunity, infections, and viruses. And, as we all know, based on the advice of washing our hands for the umpteenth time and singing “Happy Birthday” twice, prevention often works better than a cure.
In cybersecurity, as in medicine, achieving good outcomes requires as many people as possible practicing smart behaviors until they become habitual across the organization. Basic physical hygiene habits such as hand washing, teeth brushing and covering one’s mouth when coughing are taught when we’re young by parents armed with an arsenal of tips and tricks to encourage us along. And why do we consider good cyber hygiene so important? Well, because anywhere up to 90 percent of data breaches are caused by human error and are therefore preventable.
We see the so-called “nudge” theory of behavior in many places, where positive reinforcement and subtle suggestion influence behavior and decision-making. Consider two parents who each want to help their child brush their teeth before going to bed. One parent decides to attack the problem head-on by imposing a “sugar tax”: no dessert for the child who has not brushed their teeth the previous evening. The other parent chooses to play a balance game – standing on one foot while brushing – hoping the child’s attention transfers from the annoying tooth-brushing routine to the enjoyment of the game. Both parents are trying to influence their child’s behavior, but using two different behavioral science methods.
Considering that the vast majority of data breaches involve at least some element of human error, it’s demonstrable that cybersecurity is a human problem that requires a human solution. When children grow up and go to work, organizations go out of their way to spell out exactly what they’re expecting from their staff and contractors when it comes to cybersecurity. They create awareness programs; they make people sign contracts and put together some great looking computer-based security training. And, for the most part, employees ignore these programs and get on with their jobs.
Dan Ariely, professor of psychology and behavioral economics, gave a fascinating insight into his own behavior with Duke University’s online filing system. As a man who travels frequently and uses less than optimal internet connections on the road, he found uploading his research cumbersome and slow. The professor, by his own admission, had a decision to make: either use the university’s file-management system, which wouldn’t load over the VPN, and get nothing done, or use his own unsanctioned process (shadow IT) and get his work done.
From a behavioral perspective, there’s a conflict of priorities. The university’s IT pros want to keep Duke’s network and data secure, but Ariely wants to get his work done. Importantly, Ariely doesn’t think he’s betraying his organization; he thinks what he’s doing makes sense. From the university’s perspective, Ariely can potentially jeopardize the entire network. But to Ariely, the university has put obstacles in the way of his productivity. Most people are dedicated to their work and want to get it done. And they’ll ignore locked-down operating environments when those environments get in the way.
The following represent some strategies for getting people to embrace security programs and policies:
- Social proof.
Generally speaking, when people don’t know how to act, they imitate the behavior of others. It’s easy to find “social proof” everywhere. For instance, when somebody sees that 40 people have given a rental 4.5 stars on Airbnb, they are more likely to think that they’ll also like it. Research shows that depending on the circumstances, when a committed minority, sometimes as little as 10 percent, reaches a certain size (critical mass), the social system crosses a tipping point and the rest of the crowd follows.
Sometimes, we use social proof against ourselves. Take the common negative messaging around choosing a password. When we tell people, “Here's a list of really bad passwords that most people use,” individuals think, “if everyone else has a bad password, it can't be that important for me to have a good one.” In contrast, if we were to say, “70 percent of your colleagues have a stronger password than this,” it may encourage employees towards more secure cyber behavior.
- Optimism bias.
According to this year’s Cyber Security Breaches Survey, 46 percent of businesses reported having had a cyber-attack or breach in the previous 12 months. And yet, the majority of executives are irrationally optimistic that if something hasn’t happened before, it’s unlikely to happen to them. Or worse, that lightning never strikes twice: “We’ve had one attack; it’s not going to happen again.” And when it comes to employees: 77 percent recently said they weren’t worried about security when working from home and moreover believed their organization was at relatively lower risk than their competitors. The reason for this way of thinking? Misplaced optimism.
This attitude should sound familiar to those selling cyber security products, particularly when the top people are non-technical. Organizations can overcome this bias by connecting the risk to areas that executives are already familiar with and care deeply about, and showing how that risk gets compounded without sufficient defenses. For instance, many CEOs are concerned about catastrophic customer data loss, but many still haven’t made the link between ongoing security management and avoiding or successfully fighting off an attack. More pertinently still, those same executives often fear that a significant data breach may bring about the end of their tenure as CEO.
- Encouragement vs. fear.
While some may prefer to be feared rather than loved, fear-based approaches are often highly damaging when trying to encourage good cyber hygiene. When people are encouraged to believe in their own abilities to maintain cyber security, the research shows that they try harder and consequently become more secure in reality as well as in their minds.
By understanding staff through the lens of behavioral science, we can encourage a security-conscious culture that will demonstrably help reduce overall cyber risk for organizations. Our employees are not children, but when we nudge our children to wash their hands, look both ways when crossing the road and clean their teeth, their behavior – however incremental the change – shifts for the long term. The same holds true for our adult colleagues, so why not give them a nudge and see what happens?
Jerome Robert, managing director, North America, Alsid