A smarter worker is a more secure worker, says Theresa Masse, Oregon’s CISO. Dan Kaplan reports.
Information security is a challenge across industries, but arguably no vertical has more personally identifiable information to protect than government. In fact, government agencies typically are swimming in the confidential data of the large numbers of taxpayers whom they serve. But that’s where a delicate balancing act comes into play because, often, government workers’ jobs center on interacting with the public and responding to their requests for information.
“That’s why state government is here, to serve the people of the state,” says Theresa Masse, CISO of the state of Oregon. “We want to be helpful. We’re here because of their tax dollars. We want to make sure we’re giving the highest level of service that we can. [So] people tend to be helpful. [But] it’s important to realize that when it comes to confidential information, we have to be careful what we’re giving out and who we’re giving it to. We have a responsibility to protect that information.”
Masse, 59, who has served as Oregon’s security chief for the past seven years, says that because government employees tend to share personal information with citizens more than most organizations do, the threat of an insider-caused breach is ever-present. And with 58,000 employees operating across 110 agencies, boards and commissions, it’s easy to understand why Masse views the Beaver State’s workforce as the first – and often, last – line of defense against breaches.
And the threat doesn’t merely reside in Oregon state employees’ handling of sensitive information – such as data related to unemployment or welfare benefits – but also in the possibility that their actions may open the door to an external adversary.
It’s not that far-fetched a scenario. In October, hackers raided the bank account for the city of Burlington, Wash., making off with $400,000 after city computers were compromised to steal login credentials. The heist hijacked the direct deposit account information for a large number of municipal employees, and the perpetrators’ identities remain unknown.
As such, it takes just one hacked endpoint for a financial disaster to be set in motion. And with attacks growing more sophisticated and so-called disruptive technologies – social media, mobile devices and cloud computing – now commonplace, breaches that succeed through an employee’s mistake are more likely than ever.
“Phishing and social networking have enabled external folks internal access through employee accounts,” Masse says. “This often is more difficult to detect, as employees have legitimate access, frequently to very confidential information, as part of their job functions. So whether employees are misbehaving or their accounts are compromised by an external source, state information is very much at risk.”
According to a survey of 182 IT security professionals, conducted at this year’s RSA Conference in San Francisco, one in five selected external threats as their greatest security risk, compared with 27.5 percent who cited insider threats.
“While industry focus naturally gravitates toward the latest buzzwords, such as ‘advanced persistent threats,’ we were pleasantly surprised to find that practitioners primarily voice concerns with how to better manage security,” said Nimmy Reichenberg, vice president of marketing and business development at AlgoSec, a provider of network security management, which conducted the survey. “Poor visibility into what is occurring in the network, insider threats and poor processes that result in out-of-process changes are responsible for much of the day-to-day risk. Regardless of the latest attack vector or breach that makes headlines, it all goes back to strong security processes, visibility and control.”
Although, according to breach repositories, data-loss incidents caused by external adversaries traditionally have trumped those committed by insiders, studies show that insider-enabled events often are underreported – and can lead to significant brand and reputational harm. Carnegie Mellon University in Pittsburgh, which earlier this year studied 80 cases of insider fraud in the financial services industry, found that “low-and-slow” acts of insider fraud cost organizations an average of $382,000. The study, funded by the U.S. Department of Homeland Security, turned up some potentially surprising tidbits, including that managers and accountants cause the most damage, and that 93 percent of cases were carried out by someone who didn’t hold a technical role in the breached entity.
“An employee who instigates a malicious act is much more of a concern,” Masse says. “Presumably, they have more intimate knowledge of the state’s or their agency’s security mechanisms, and can tailor their efforts to avoid discovery. These folks are much harder to detect and potentially can compromise a significant amount of data over a much longer period of time because they have legitimate access.”
In an effort to avoid this type of fate, the state of Oregon has focused its protection efforts on defining data, logging events and controlling access. But Masse also has one other trick up her sleeve to get employees more security minded. Think of it as a version of the Department of Homeland Security’s “If you see something, say something” campaign, applied to information security.
“You’re not going to be penalized because you brought something forward,” she says. “It happens. It’s water under the bridge. Accidents happen. Life happens. That’s OK. Let someone know so appropriate action can be taken. It may be nothing, but we’d rather investigate than have it simmer.”
Is training worth it?
Most security experts believe comprehensive policies and robust user awareness training are critical underpinnings to any organization’s security program.
“There’s a tendency to blame employees, as if they’re somehow at fault.”
– Mark Johnson, chairman, Risk Management Group
And when it comes to organizations sustaining malware infections, the source of the attack is often an unwitting user who clicks on an attachment or link that they shouldn’t. As such, the theory goes, if the user were properly trained to spot attempted network intrusions, many of today’s most devastating breaches could be stymied.
Many organizations have taken that to mean they must invest in security awareness programs, which vary in shape and size, but commonly take the form of users passing an annual exam to validate that they aren’t going to click on that legitimate-looking – but malware-laden – attachment. On the surface, it makes sense to implement these types of programs, especially considering a number of regulations and industry requirements mandate such training.
But, because breaches remain so regular – even at companies that specialize in data defense, like RSA – at least some industry gurus question the effectiveness of user education. Recent debate was sparked by an opinion piece, written by Dave Aitel, CSO of security firm Immunity, which ran in the July issue of CSO Magazine. He argued that it’s a “myth” to believe that employee training actually works, citing the RSA breach, in which a worker clicked on a malicious attachment that ultimately led to a major breach of intellectual property.
“A user has no responsibility over the network, and they don’t have the ability to…protect against modern information security threats any more than a teller can protect a bank,” Aitel wrote.
The column ignited much discussion, including from well-known privacy researcher Adam Shostack, who contended that awareness programs should only be written off if organizations determine they’re not worth the investment. To accomplish that, he challenged companies to develop more reliable risk metrics.
“Opinions, including mine, Dave’s and yours, just ain’t relevant in the face of data,” Shostack wrote. “If you’re outraged by Dave’s claims, prove him wrong. If you’re outraged by the need to spend money on [training for] social engineering, prove it’s a waste.”
Mark Johnson, chairman of the U.K.-based Risk Management Group, a consulting firm, sides with the belief that training employees probably isn’t worth it. Considering the emergence of BYOD, social media and cloud, Johnson would instead like to see end-user organizations demand more of their providers.
“There’s a tendency to blame employees, as if they’re somehow at fault,” he says. “What we’re looking at is a democratization of [mobile] devices. Employees are deciding what to install, which network to use. They’re acting as if they’re system administrators, and most of them haven’t been trained for that. I think the responsibility of telecom operators and manufacturers of devices is to put strength in the hands of users. It would seem that more could be done, given our dependency [on their services and offerings].”
And the problem will only grow, he calculates, considering that risk is a function of the number of devices, vulnerabilities and malware samples that are present, all of which are growing at staggering rates.
“The only people in any position to deal with the problem are the ones who manufacture the goods and operate the network,” he says.
Doug Jacobson agrees that most training exercises border on the insipid, especially resources such as “Top 10 lists,” which tend to oversimplify responsibilities by failing to call on employees to critically think about potential security incidents beyond the obvious or common. But, Jacobson, a professor in Iowa State University’s Department of Electrical and Computer Engineering, says he is not giving up on the importance of training. That’s why he’s helped launch the Information Systems Security Laboratory at the college in Ames.
Billed as a first-of-its-kind effort, the for-profit center will offer training, product testing and outreach specifically geared for IT workers – not security professionals – who are employed at small and midsize businesses in Iowa and across the Midwest.
“They play a critical role in the way the organization operates,” he says. “They’re the ones who often are in charge of a lot of the infrastructure. If they don’t have a good handle on security, they may see things they may not know they’re seeing. IT staff needs to be aware of threats. They’re not going to be able to go out and buy top-of-the-line [products].”
In Oregon, as in any government, especially one operating in a recession, budget dollars for security – and across the board, for that matter – are at a premium. Stuck in a spending quagmire for several years, Oregon has faced job losses and has been forced to institute mandatory employee furloughs, says Masse, who oversees the state’s enterprise security office, which is responsible for developing policies, standards and guidelines for all of Oregon’s government agencies.
Masse admits that an employee base of nearly 60,000 presents an unpluggable exposure. But she sees inherent value in training workers, especially on the fundamentals – like not sharing one’s password and not clicking on suspicious links or attachments.
Aside from enterprise directives that each agency must have an information security training program in place, Masse’s office also has formed a 25-person information security council, which meets on a monthly basis to discuss critical issues. The state also leans on a number of no-cost initiatives, such as the Multi-State Information Sharing and Analysis Center (MS-ISAC), which provides complimentary materials, such as literature and webcasts, that can be shared with all of the state agencies. In addition, Oregon recently participated in a federally run two-day exercise known as Cyber Storm, which forced key personnel in various state departments to engage in and respond to simulated cyber incidents. The state also has established a Federal Tax Information Committee, which includes members from various departments that handle highly confidential tax data – such as the Department of Revenue and the Department of Human Services.
Ultimately, educating trusted insiders is just one tool in any organization’s security arsenal, Masse says. And even if workers are schooled in security best practices, there’s no way to guarantee that there won’t be a bad apple among the bunch. That’s why the only rational tactic to take is to consider and present the threat in business terms, and work to mitigate the risk.
“We focus on the business part of it, and that helps [employees] to digest and grasp it,” Masse says. “I think it’s resonating.”
[An earlier version of this story incorrectly stated that the city of Burlington, Vt. was hacked. It is actually Burlington, Wash.]
Pat-down: TSA and the insider threat
The Transportation Security Administration (TSA) deals with its fair share of criticism – such as long lines, controversial screening practices and a record noticeably absent of terrorist apprehensions since it was formed following the 9/11 attacks. But while its mission is to protect airline safety, it certainly has plenty of confidential data of its own to defend, and has been forced to deal with a number of insider-caused breaches, both cyber and physical. For example, in 2010, a former TSA employee was indicted on charges of planting malicious code on a server, which contained data about suspected terrorists that was used to vet airport workers.
We asked the TSA’s CISO, Jill Vaughan, to explain the agency’s strategy to deter cyber threats that arise from within its expansive employee ranks, nearly 60,000-strong.
SC: What best practices does the TSA have in place to deal with insider threats?
JV: TSA performs on-site assessments and training at TSA locations nationwide and has developed in-house tools to log and assess business communications and mitigate insider cyber threats. TSA also uses network-based data and business intelligence to identify activities of interest within TSA, and identify network locations that require additional monitoring and scrutiny. TSA also implements tests of IT systems, specifically focused on insider threats.
SC: How do these best practices differ depending on the threat?
JV: These best practices raise awareness and provide communication mechanisms for employees to report suspicious activity, building human insider threat monitors that can report on their observations. Additionally, identifying vulnerabilities allows leadership to take actions that reduce risk.
SC: Given the sensitivity of TSA’s duties, how are employees vetted and controlled when accessing the network?
JV: Each user’s credentials are verified every time the system is accessed, and additional testing is conducted to ensure employees apply proper protections when accessing TSA networks. Access to different networks is limited based on the specific work area of individual employees.
SC: The TSA works with many contractors. What policies are in place to ensure security on this front?
JV: TSA builds IT security into the framework for appropriate contracts to ensure compliance with government standards and best practices.