(Keep feeling) Fascination
Social engineering works because threat actors take advantage of humans’ innate trust in others, inattention to detail, desire to be helpful, and lack of insight into how social engineering campaigns are built by devious minds. To those working in information security, a simple phishing email might seem painfully obvious. To someone whose guard isn’t up 99.99% of the time, and who has a hundred job-specific tasks on their plate, an email claiming to include shipping information evokes curiosity, even if they haven’t shipped a package recently. A phone call from a “helpless” person to the help desk—a role where helping is literally the number one requirement—is bound to spur the call recipient into an attempt to aid.
These same human behaviors are what allow social engineering to succeed. To be clear, a person who falls victim to phishing is not dumb, careless, or deserving of ridicule or humiliation. What people need instead is education and clear instruction for spotting and handling social engineering attempts. Providing these tools empowers non-IT/security staff to work with security teams, against the forces of evil, lessening the probability that a scam will achieve its goal and harm your organization.
After all, the human dupe is often just the tip of the iceberg, the instrument through which the attacker steals valid network credentials, takes control of an individual’s workstation, or leaves behind a juicy piece of malware for the user to spread throughout their system. When this happens, the security team is frustrated and the conned user feels guilty. (In a few cases, employees have been fired for falling for phishing, though this is generally not the right response.) Instead of hopelessness and irritation, try a few of these tips from experts who make their living by tricking people into doing things they shouldn’t.[i]
Make it real; make it relevant
Jason Wood, Vice President, Red Team, at a Fortune 100 financial services organization advises companies to promote security awareness with their employees. While it sounds simple, Wood says that many of his clients are “amazed at the scams that are run by cyber criminals, and even more amazed at the scammers’ goals.” He notes that most people want to be helpful and are generally trusting (unlike the skeptical infosec community), which is what lands them in hot water. “People are afraid of getting into trouble by not responding to urgent-sounding requests or demands,” he says, adding that “Scammers play on those emotions to get their victim to accept what they are saying.” Cyber criminals play an instinct-driven game that works, chiefly because security training programs are not enabling users to be more self-sufficient.
Kevin Johnson, Founder and CEO of Secure Ideas, agrees. “Our biggest problem in security,” Johnson laments, “is that we tell users what to do [in security awareness training] but we don’t tell them why. Humans want to know ‘why,’ to understand what they are doing wrong so they know how to react.” He likens it to teaching his young daughters not to touch hot things: “We didn’t just yell at them and slap their hand away if they tried to touch the hot stove. We tried to explain to them that if they touched the hot stove they would get hurt.” It’s the combination of “hot” and “hurt” that Johnson feels is important, the same cause and effect scenario security teams should use in awareness training. “You don’t want to iterate through every badness,” like that the stove is hot, and the tea kettle is hot, and the pot with boiling water is hot, he says.
In this same vein, security teams can’t iterate through every “what if” when a user receives an email with a link or attachment. Helping users make the connection between receiving a link, clicking on that link, and the potential security incident that may ensue is the critical part of training. Security awareness training today mostly does not focus on consequence other than to say, “we will get hacked!” While getting hacked is a consequence, it’s not personal and it’s ephemeral. Use personal stories and anecdotes to engage employees. “Share real-life use cases,” says Wood, reminding us that “We've got tons of news stories out there of people being conned and what happened to them. So, why use the dry awareness training that so many companies inflict on employees? Anonymize the victims, but tell their stories.” Wood says he’ll occasionally show YouTube videos of police or help desk personnel responding to, and then explaining the intent of, a phone scam, which hits home for many users. It makes social engineering real and personal, without requiring users to live the experience themselves. “There's so much real content available,” Wood says, “but our [current] awareness training generally sucks. Ditch the suck and tell stories that people will remember and are real.”
Don’t be a hypocrite
How many times do you, the security practitioner, send out instructions or security awareness information that includes a link or an attachment? Yet what is security trying to stop users from doing? Clicking on links and opening attachments. But it’s only suspicious emails with links and attachments, you say.
“We have to stop designing our systems and applications contrary to the way we want people to act,” Johnson criticizes, and rightfully so. Security awareness programs can’t train users to trust only the security team. If it’s from security, it’s alright? No. What if the attacker has spoofed security’s email address? And what about our third-party systems, like our CRM vendor or 401(k) provider? Every day people receive dozens (if not more) of emails with legitimate links or attachments, yet security warns people not to click on links and open attachments. Since these are legitimate emails—and users know this—companies are perpetuating the problem by using the same methods that are employed by threat actors. If we want to truly change behavior, we must stop confusing people (and convince the colleagues who send these emails, and who design the systems that send them, to stop it, too). Johnson warns that “we’re setting people up for failure” when we tell users to stop acting in a way we’re continually promoting through everyday practice.
Give people the benefit of the doubt
When Johnson and his team conduct penetration testing or awareness training, he’s often told by the client contact, “Don’t test our users. We know they’ll fail.” This is crazy! Imagine, if you will, telling a toddler not to practice reading books because he’ll only fail. Stick to your picture books, kid. Or how about telling a young girl to stop trying to shoot a basketball into the hoop because she’s missed the last 20 times already. Give up and go watch TV. We’d never do these things, yet when it comes to training employees on security practices, we assume someone will fall for the phish, give up a password, or send over a confidential document, and throw up our hands in defeat.
Though employees may, indeed, act in a way security would not prefer, Johnson says every company must look at these instances as an opportunity to educate. If a person forwards sensitive data when asked, without question, go back to section #1—make it real; make it relevant—and explain to that person why the data was the “hot stove” and why the consequences weren’t just temporary soreness and a tongue-lashing.
When a person fails a security test, provide results, explain why the action wasn’t the right decision, share what could happen in a real-life scenario, and offer alternatives. For example, the employee could be told to send a separate email—not a reply—to the sender, if known, asking if a legitimate attachment was emailed. Suggest that the user not provide information to the help desk when the help desk calls; recommend that the user call IT back at the number listed in the company directory.
Security training is about helping users and giving them the tools they need to succeed. If security is already convinced that users will never be able to manage social engineering on their own, the problem becomes its own self-fulfilling prophecy.
To mitigate social engineering threats, security needs to change people, “and that’s the hardest thing to do,” says Johnson. Wood concurs, “Victims don't have the mindset of a crook and aren't aware of what these crooks are trying to accomplish. If people aren't aware, they can't recognize when someone is trying to play them.”
Most of the time, people don’t even realize they’re engaging in risky behavior by clicking on a link, holding open a door, leaving a workstation unlocked, or even by using “Password1” as their system password. People are just trying to get through the workday and be as helpful as they can.
To counter social engineering, the security team must equip everyone in the environment with tools they can use. Start by explaining what social engineering is—even if you think people know—then share real, relevant stories of what happens when a person is conned into acting on behalf of the criminal. Use more than scare tactics to get your point across; share examples of attempted attacks that were spotted and stopped, too.
In addition, change your own methods of communicating with employees about what’s “bad.” If the security team can identify even one training or communication that includes a link or attachment, you’re setting your users up for failure. Train users to do what’s right instead of drilling into them what’s wrong.
Last but not least, don’t blame people if they don’t get it right, and definitely don’t assume they can’t be educated. People will learn, even if it’s slowly, and society is becoming more security aware over time. Social engineering works because it relies on humans being human. Teach your users as you would wish to be taught if you were learning something new and different. Give people the benefit of the doubt while providing the tips and tricks that will make people successful against cyber threats.
[i] N.B. These white hats are working for the good of mankind rather than the demise of the corporate workplace. Pentesters’ “crooked” deeds are enacted as a teaching and awareness tool rather than an opportunity for harmful exploitation.