A report earlier this month alleging that IT and remote access management software provider Kaseya ignored and mistreated employees who tried to warn the company of multiple security risks has demonstrated the vital importance of creating a corporate culture that rewards cyber vigilance and removes the stigma of human error, experts say.
According to the report by Bloomberg, former staff members who spoke anonymously said they were ignored and either decided to quit or were fired between 2017 and 2020 after reporting problems to management such as old and unpatched code and inadequate encryption. Kaseya would later suffer a major supply-chain ransomware attack in July 2021 that affected dozens of MSPs and their clients.
In a blog post this month, Dutch researchers at the DIVD CSIRT reported that back in April they had privately disclosed to Kaseya multiple flaws in its VSA software, one of which was used in the REvil ransomware attack. But in contrast to the Bloomberg report, the CSIRT post said that Kaseya’s response to the disclosure was actually “on point and timely” compared to other vendors.
Still, it’s possible both accounts are true. Perhaps Kaseya took an outside security agency more seriously than its own internal employees. If so, then it is fair to ask whether the REvil attack would have happened if the employees’ warnings had been heeded, and whether Kaseya incurs additional liability for not being more responsive.
“A similar situation happened with the Target hack back in 2012 or so,” said Ira Winkler, president of Secure Mentem. “The security team reported that [a] FireEye tool detected bad things, and one of their multiple CISOs told them to ignore it, as it was a new tool. One person quit that I know of, sensing impending disaster. This happens way too often where employees are told they are just being alarmist and should ignore a problem.”
To avoid such scenarios in the future, companies must take steps to create a workplace that welcomes the reporting of vulnerabilities and mistakes, and doesn’t punish individuals for speaking up. “In the end, good management proactively creates such an environment, or at least doesn't discourage reporting,” said Winkler.
“There are very forward-leaning organizations today that are trying to de-shame reports of security incidents or risk,” said Tim Sadler, co-founder and CEO of Tessian. “They're trying to encourage employees [by saying]: ‘If you see anything wrong, if you have any concerns — you must raise them, and you're not going to be punished.’”
To foster this kind of environment, Winkler suggested to SC Media that companies institute a reporting infrastructure through which employees can share their findings — something similar to an “ethics reporting infrastructure” that many organizations have. “For most organizations, this should just be an extension of the reporting of safety, harassment and other violations or concerns,” said Winkler.
But for that to be truly effective, you still need to establish a culture of vulnerability-sharing, and that starts at the top. John Hellickson, cyber executive adviser at Coalfire, cited the “importance of executive management's role in establishing and promoting a security-first culture.” Moreover, the CISO needs a seat at the board table “to discuss the overall cybersecurity program strategy and top risks within the organization.”
In addition, “the CISO should ensure that the cybersecurity program is directly tied to the company's overall vision, mission and goals, and that a portion of the C-suite's incentive pay is directly tied to the success of security specific initiatives and associated KPIs that cyber risk is being managed appropriately,” said Hellickson. At the same time, the board must also hold top executives accountable for their role in managing cyber risk, he added.
Hellickson expressed particular disappointment over the Bloomberg report’s allegation that one employee was fired shortly after writing up a 40-page report on security issues.
If true, “the impression this gives to the rest of the employees — regardless of the actual reason the employee was fired — could create a culture of not rocking the boat on critical issues within the organization,” said Hellickson.
AJ King, CISO at incident response company BreachQuest, agreed that organizations “need to adopt a ‘don’t kill the messenger’ attitude when it comes to discussing and highlighting security issues,” but he also acknowledged that “not every security warning is as dire as the individual reporting it may believe.” (Indeed, Hellickson even acknowledged that the fired Kaseya employee’s 40-page report “might not have been the best way to attempt to get an executive team to understand 'critical risks' in the organization.”)
To help companies prioritize the most important reported issues, King said that organizational leadership “should build boards of internal subject matter experts who can objectively rate the severity of an issue. This board must be empowered to prioritize security fixes over feature releases — or [else] negative PR is sure to follow in the case of an inevitable incident.”
Positive reinforcement and incentives for employees also go a long way toward improving internal reporting culture. Winkler suggested a reward system “similar to the bug bounty programs people have… This is essentially a gamification program at its core and involves creating awards, but most importantly promoting the program. This may involve financial rewards or other acknowledgement that is meaningful.”
In a similar vein, Sadler advised that organizations seek out employees who are willing to volunteer to be trained as security champions who can lead their respective teams or departments in all matters of cyber hygiene.
“Even if you don't work in security as your day job, you can be the champion for your team, for your group, for the people in close proximity to you,” said Sadler. “I think the most successful kind of practices are where organizations have actually created their own security ‘swag’ or… security badges” that champions can earn for their hard work. Their successes at maintaining cyber hygiene efforts could even be folded into their overall performance review, he noted.
Sadler also suggested that companies could make employees feel better about coming forward with security concerns by openly disclosing and praising fixes and mitigations that were the result of a worker speaking up.
Still, it’s one thing to report a vulnerability or potential incident that someone else caused.
But what about when an employee needs to report his or her own mistake? For instance, perhaps a worker was tricked into clicking on a malicious link in a phishing email.
“I once worked with an oil company that wanted everyone to report all safety incidents, including near-miss incidents that could have resulted in injury, but didn’t. This included wanting people to report themselves, which was a hard sell,” said Winkler. But it was important because “the more information they had, the more they could proactively determine potential harm and figure out how to mitigate it.”
Sadler agreed that this is a tricky situation. Indeed, a June research report from Tessian revealed that of 4,000 surveyed working professionals, 27% said they opted not to report a security mistake they committed for fear of disciplinary action.
“That's kind of an overwhelming stat, and I think the answer to how we actually get people to… report mistakes, even if they're their own mistakes, is this idea of de-shaming security incidents,” Sadler said.
“We need to have that empathy [and understanding] that people are going to make mistakes,” Sadler continued. “These things happen just because we are human. We make 35,000 decisions every single day, and sometimes on those decisions we get things wrong.”
Kaseya declined comment for this story.