When a data security incident happens, never say “Never again!” according to Josiah Dykstra, technical fellow at the National Security Agency’s Cybersecurity Collaboration Center.
Dykstra and co-presenter Douglas Hough, senior associate at Johns Hopkins University's Bloomberg School of Public Health, warned Black Hat conference attendees on Thursday about the dangers of having reflexive, knee-jerk reactions to data breaches and other unwanted infosec events.
The co-presenters noted that end users, security professionals and cyber leaders can all, in their own ways, fall victim to “action bias,” a psychological phenomenon that leads people to erroneously conclude that taking swift action is better than doing nothing at all, even when it would be wiser to take a breath and carefully deliberate next steps.
As a metaphor, Dykstra and Hough cited a study that found that soccer goalies who stay longer in the center of the net when an opposing player is about to take a penalty shot are more likely to block the goal attempt than those who try to anticipate the shooter’s movements and move left or right early.
Action bias can occur immediately after an attack, such as when a phishing email with urgent messaging tricks an employee into clicking a malicious link, or when an organization must decide whether to pay a ransom demand before a countdown clock runs out. “Phishing works because people have this intuitive action to act without thinking beforehand,” explained Dykstra.
But action bias can also surface in the days and weeks following an incident, as organizations seek to mitigate the damage and prevent a similar situation from happening again. For instance, Dykstra said that security and business leaders often feel compelled to save face after an incident by publicly declaring, “Never again!” Those, he said, are “two of the most dangerous words in all of cybersecurity.” “I hear this a lot in my life, and I think that lots of us hear it from politicians and business leaders… which is: ‘Man, that crisis was difficult. We can never live through that again.’”
But this line of thinking can cause more harm than good. For one, it can lead companies to blow far past their budgets.
“It encourages people to try anything and everything to stop it from ever happening again. That is a real recipe for wasteful and wrongful spending [of] our resources,” Dykstra said.
Secondly, such statements can hurt company morale and reputation when something inevitably does happen again. “It actually makes an incorrect presumption,” said Dykstra. “‘Never again’ sets this unprecedented goal that we can be 100% successful – when in fact, the attackers are going to keep attacking, and it sets a really negative situation, at least for those of us in cybersecurity.”
“It leads to more stress than it's worth,” he added later.
To counter the effects of action bias, Dykstra recommended that businesses adopt a comprehensive risk management program operating on a year-round, day-to-day basis to help determine where their greatest vulnerabilities lie and where preventative steps are most needed and realistic.
“Done well, that should reduce the amount of action bias that occurs, because we've done preparation. We've thought ahead about the problem,” said Dykstra. “If we understood how much… our assets are worth, and we devote the appropriate amount of preparation and appropriate amount of security, [then] we're not living in a world of ‘never again,’ but we're lowering risk, minimizing risk, as much as possible without spending all of the corporation's resources to try and get it to zero.”
Dykstra also recommended a combination of planning, playbooks and tabletop exercises to prepare for various threat scenarios, because when we “end up in a new, interesting situation that we didn't prepare for, we might end up jumping to the wrong or even a contradictory decision in the end,” he said.
Planning and preparation especially mean shifting “all of your thinking to before the crisis, so that in the moment you are prepared and you have practiced and you're ready for that,” Dykstra added. “Knowing who should be in the room to make a decision, where is the data we might need to make a better decision, whatever the situation is, even if it's [one] we never anticipated. We can still benefit from the preparation process.”
Businesses can also invest in training and awareness programs to teach the workforce how hasty overreactions driven by action bias can result in human error and exacerbate an already bad situation. Such training might instill some “healthy skepticism in the heat of the moment,” Dykstra noted. Likewise, senior leadership must be educated to understand that responses to data security incidents will sometimes require deliberation before action.
Dykstra said that future defenses against action bias might include tools that validate potentially malicious emails or disinformation tweets to determine their authenticity, or computer functions that alert users and seek confirmation before they take a risky action.