For 25 years, The Weather Channel has been in the business of calculating risk. Each day, forecasters prognosticate which way a particular snowstorm will track, whether a drought-stricken region will see measurable rainfall, or if a hurricane will strengthen prior to landfall.
Even though it sometimes seems they are just throwing darts, meteorologists across the world use the best scientific data at their disposal to predict how fierce a given weather event might be and what impact it will have, while also accounting for uncertainty.
Their conclusions are neatly packaged into easy-to-comprehend graphics and percentages that convey the best possible predictions to the viewing audience — in The Weather Channel’s case, some 87 million households.
Meanwhile, behind the scenes at the cable network, a team of information security professionals is also gauging and predicting risk. It is the same concept that their on-air counterparts are applying, except that these employees have the burdensome task of preparing their data for consumption by upper management.
Instead of worrying about the chance of tornadoes and typhoons, the security group is crunching data and mapping out a vulnerability management strategy to assess IT security risk — because, in the end, The Weather Channel is just like any other organization trying to fend off cyberattacks.
“We’re a 24/7/365 shop,” says John Penrod, 38, chief information security officer of the cable network, which broadcasts from the Atlanta area, with its parent, Landmark Communications, based in Norfolk, Va. “Anytime we’re losing any of our weather products, we’re losing revenue.”
As compliance demands grow and headline-making hacker attacks increase in scope, business executives are demanding a new type of CISO, one who understands how IT risk plays into the bottom line and one who is able to justify spending.
As a result, IT pros such as Penrod are quickly realizing that the fastest way toward establishing that metric is to have a handle on network vulnerabilities. Visionary security players are warming up to the notion that vulnerability assessment does not mean merely probing the network for flaws — it means transforming those findings into actionable results.
“Vulnerabilities are an essential part of the risk equation,” says Michael Montecillo, analyst for security and risk management at the Boulder, Colo.-based Enterprise Management Associates. “Threat is very difficult to determine, but when you have vulnerabilities, that’s something that’s fairly straightforward to address. When you’re talking about risk, vulnerabilities are the easiest one to measure in terms of conducting assessments.”
Protecting weather data
The Weather Channel recently deployed the RedSeal Security Risk Manager to offer visibility into its threat profile. The tool helps the organization create a risk management policy, which sets a formula for determining action when increased risk is detected on the network, Penrod says.
“If we’ve identified a threat or an area of the network that needs additional protection, there is a mitigation tree we follow that says either correct the issue or, if you can’t, let’s go find some money,” Penrod says.
One may not consider The Weather Channel a typical hacker target — it lacks the mounds of sensitive data that a financial services firm or retail outlet may be storing — but it demands protection nonetheless.
As Penrod says, should an attacker gain access through one of The Weather Channel’s 10,000 servers and compromise the broadcast environment, from which radar imagery, local forecasts and warnings are delivered, it could prove disastrous for the brand. More importantly, human safety could be impacted.
“When you consider our responsibilities…it’s very important that we make sure [weather data] is there and it’s always accurate,” he says. “If you’re sitting in Florida and there’s a hurricane coming your way, it’s important for you. To miss something like that could put people’s lives at stake.”
The RedSeal offering creates a type of “spreadsheet for security” to automatically map and assess the network infrastructure and prioritize vulnerabilities, while offering nightly penetration tests and ensuring compliance with government regulations, says Mike Lloyd, Redwood City, Calif.-based RedSeal Systems’ chief scientist. In The Weather Channel’s case, the product helps segregate the broadcast segment of the network, which runs a large number of legacy systems that create additional risk, Penrod says. The product also assigns a quantified risk score based on its findings, an aspect that is significant to executive management, Lloyd says.
Metrics are a new trend in vulnerability assessment products because they provide the type of information that people such as Penrod can use to express the IT department’s risk posture to the C-suite — in easy-to-understand language.
“The improved communication from an IT organization to upper executives helps a lot,” Lloyd says. “It’s always a struggle for funding inside an organization. If you can actually demonstrate the risk to an organization…frankly, when you’re the CISO, you look good when you can do that. You look like you’re in control.”
Vulnerabilities on the rise
Few can dispute that IT risk management is growing in importance, as IT touches all areas of the business. It is how to calculate such risk that is causing the biggest headaches.
“Risk is potential damage to an organization’s value, often from inadequate management of processes and events,” says a 2007 Symantec study on the topic. “IT risk is emerging as a significant component of total business risk as IT assumes a more prominent role in organizations, and can account for more than 50 percent of total capital expenditure at some companies.”
Enterprise Management Associates’ Montecillo, who previously served as the first vulnerability management coordinator for an unnamed state government, says risk is the new compliance — and if a business has a handle on risk, chances are it is meeting compliance demands.
“Executive staff are starting to see where risk can put them in the red in terms of cost to their organization,” he says. “They are performing the due diligence to try and address those risks so they don’t end up as a TJX.” The Framingham, Mass.-based company, which operates the T.J. Maxx and Marshalls clothing chains, suffered an online intrusion of its processing networks that the company admits resulted in the theft of data connected to at least 45.6 million credit and debit cards (other estimates double that figure). In November, TJX agreed to pay up to $40.9 million in a settlement with three bankers’ associations.
To avoid an exposure such as that suffered by TJX, a number of security firms have unveiled products in recent months that seek to help companies understand their IT risk. Still, a silver bullet for IT risk management is lacking, experts say.
“The problem with risk management is that it’s not yet a science,” says Jeremy Ward, service development director for Symantec Global Services, based in Cupertino, Calif. “The reason for that is because it’s a young and immature discipline. Financial risk management has hundreds of years of a track record. The data we’ve got to go on in terms of IT risk management is relatively recent.”
He says the answer lies in data sharing. One group, MITRE, a nonprofit with principal locations in Bedford, Mass., and McLean, Va., has been working since 1999 to maintain a standard for gauging risk in the context of network vulnerabilities.
Last year gave rise to about 7,000 unique vulnerabilities, says Steve Christey, principal information security engineer at MITRE, which maintains the Common Vulnerabilities and Exposures (CVE) list, a dictionary that provides the common names for publicly known security vulnerabilities.
Since 1999, MITRE has tracked some 28,000 vulnerabilities in packaged software. While the sheer number of bugs is certainly cause for concern, flaws do have one positive attribute: they provide a tangible way to assess risk, say experts.
“In comparison to the other kinds of things that you need to track when doing risk management, vulnerabilities are often concrete and actionable,” Christey says. “When you get to the board room, you can’t talk tech, but numbers are more understandable.”
Montecillo agrees, saying vulnerability management has become a key part of corporate governance and of the process of “operationalizing” security.
“There’s such a high number of intrusions [initiated] over known vulnerabilities,” he says. “From an executive standpoint, it’s known that this is an issue.”
Each CVE listing in the National Vulnerability Database, the U.S. government repository of standards-based vulnerability management data, supports the Common Vulnerability Scoring System (CVSS), an open framework that standardizes the severity of vulnerabilities across heterogeneous platforms.
Version 2.0 of the CVSS, managed by the Forum of Incident Response and Security Teams (FIRST), was released earlier this year. It rates the severity of weaknesses on a scale of 0 to 10. CVSS takes into account three factors: the base score represents the constant characteristics of the vulnerability; the temporal score accounts for characteristics that change over time, such as the availability of exploit code or a patch; and the environmental score accounts for characteristics specific to a user’s environment.
“CVSS is a way to provide a consistent risk metric,” says Christey, who sits on the standard’s special interest group. “All of the vulnerability scanning tools and all of the alerts will use their own definition of risk, so a consumer of this information, if they’re not using CVSS, might get multiple interpretations of how significant a single vulnerability is.”
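The consistency Christey describes comes from the fact that a CVSS rating is a fixed arithmetic formula rather than a vendor’s judgment call. As a sketch, the v2 base-score equation can be written out as follows, using the metric weights published in the v2 specification (the temporal and environmental scores then adjust this base number):

```python
# CVSS v2 base-score equation, per the FIRST specification.
# Metric value lookup tables (published v2 weights):
AV = {"L": 0.395, "A": 0.646, "N": 1.0}    # access vector: local/adjacent/network
AC = {"H": 0.35, "M": 0.61, "L": 0.71}     # access complexity: high/medium/low
AU = {"M": 0.45, "S": 0.56, "N": 0.704}    # authentication: multiple/single/none
CIA = {"N": 0.0, "P": 0.275, "C": 0.66}    # impact: none/partial/complete

def base_score(av: str, ac: str, au: str, c: str, i: str, a: str) -> float:
    """Compute the 0-10 CVSS v2 base score from the six base metrics."""
    impact = 10.41 * (1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a]))
    exploitability = 20 * AV[av] * AC[ac] * AU[au]
    f = 0 if impact == 0 else 1.176  # zero-impact vulnerabilities score 0
    return round((0.6 * impact + 0.4 * exploitability - 1.5) * f, 1)
```

A remotely reachable flaw needing no authentication with partial impact across confidentiality, integrity and availability (vector AV:N/AC:L/Au:N/C:P/I:P/A:P) works out to the familiar 7.5 seen on many CVE entries.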
These days, any robust vulnerability management platform is able to flag unauthorized network alterations and ensure critical network components are properly configured.
As the number of flaws increases, making it more and more difficult to catch every network weakness, many organizations are turning to change and configuration management as part of their defense-in-depth strategy, Montecillo says.
“A configuration management program sets the standard for what’s acceptable for systems,” he says. “It’s a cost savings and a time savings. You don’t have to assess and find out there’s ‘x’ number of vulnerabilities. You have a baseline for what’s already acceptable.”
Christey adds that protecting against known vulnerabilities is important to defend against targeted attacks, but focusing on system configuration is equally critical. For example, he says a company may have deployed all the latest patches, but if a user’s machine is mistakenly running as an administrator, the unauthorized privilege level could permit an attacker to drop a piece of malware for which there is no signature.
Change control, meanwhile, allows the security team to learn whether any changes — either by a malicious outsider or, more likely, an accidental insider — created a vulnerability.
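The baseline approach Montecillo describes can be sketched in a few lines: collect a system’s current settings, compare them against the accepted standard, and flag every deviation for the security team. The setting names below are hypothetical, chosen only to illustrate the pattern:

```python
# Minimal sketch of a configuration-baseline audit. The baseline records
# what is "already acceptable"; anything that drifts from it is a finding.
# Setting names here are illustrative, not from any real benchmark.

BASELINE = {
    "password_min_length": 12,
    "guest_account_enabled": False,
    "telnet_service": "disabled",
}

def audit(current: dict) -> list:
    """Return one finding per setting that deviates from the baseline."""
    findings = []
    for setting, expected in BASELINE.items():
        actual = current.get(setting)
        if actual != expected:
            findings.append(f"{setting}: expected {expected!r}, found {actual!r}")
    return findings
```

Run nightly against every host, a report like this doubles as the change-control record: any new finding means something — an administrator, an attacker or an accident — altered a system after it was last known to be compliant.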
Back at The Weather Channel, Penrod agrees that network operations and general infrastructure visibility are the most important factors to consider when determining risk.
“Our driver is to not end up on the cover of the Wall Street Journal,” he says. “We’re just looking to be able to pull all of this data together and see what the network looks like. What we value most is brand loyalty and trust.”
GET A GRIP
What leads to risk?
The number and types of vulnerabilities in the network, both client-side and server-side, may be one of the easiest ways security administrators can get a handle on the health of their operating infrastructure. But vulnerabilities are not the only place to look.
The SANS Top 20 for 2007 lists four other major risk categories:
- Security policy and personnel, including excessive user rights, phishing and unencrypted portable devices.
- Application abuse, including instant messenger and peer-to-peer programs.
- Network devices, including VoIP phones and servers.
- Zero-day attacks, for which there is no patch.
— Dan Kaplan
The development phase
Some entities are working to help businesses by proactively minimizing vulnerability-related risk during software development.
Nonprofit research organization MITRE is overseeing the Common Weakness Enumeration (CWE), a dictionary of common mistakes made when developing software, such as buffer overflows or cross-site scripting.
The initiative, which kicked off about a year and a half ago and is starting to gain momentum, is a natural offshoot of its eight-year-old Common Vulnerabilities and Exposures project, says Steve Christey, principal information security engineer at MITRE.
“We found that many programmers make the exact same kind of mistakes, regardless of what kind of software they’re developing,” Christey says. “CWE starts to catalog those common mistakes that get made.”
The hope is that the CWE lexicon can serve as a reference guide for software developers, says Robert Martin, principal engineer at MITRE, which operates three federally funded research and development centers for the federal government.
“There are specific things that people can look for,” he says.
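Cross-site scripting, one of the mistakes the article cites from the CWE catalog, is a good example of how specific those look-fors can be. A minimal illustration of the weakness and its standard fix (function names here are invented for the example):

```python
import html

def render_comment_unsafe(comment: str) -> str:
    # The CWE-cataloged mistake: user input is dropped into HTML unescaped,
    # so a comment such as "<script>...</script>" runs in the viewer's browser.
    return f"<p>{comment}</p>"

def render_comment_safe(comment: str) -> str:
    # The fix: escape HTML metacharacters so markup arrives as inert text.
    return f"<p>{html.escape(comment)}</p>"
```

The same one-line distinction recurs across languages and frameworks, which is exactly Christey’s point: programmers make the same kind of mistake regardless of what software they are building, so cataloging the pattern once lets every reviewer look for it.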
Meanwhile, Djenana Campara is leading the Software Assurance Special Interest Group at the Object Management Group, a nonprofit consortium; the group is seeking to establish a common framework for expressing trustworthiness in software.
The government-sponsored initiative is doing its part to mitigate risk for organizations by creating a common standard that can be used by tool vendors to develop products that sniff out vulnerabilities in the software development stage.
“There’s no way one vendor or one tool can address complicated issues,” Campara says. “The software systems today are very complex, from the technologies to the platforms they run on to the languages they are written in. All of that has to be taken into account.”
The model — which is awaiting acceptance by the International Organization for Standardization (ISO) — also can be put to use by suppliers to express trust to consumers, and by organizations that want to assess the risk a piece of software introduces to their environment.
“Microsoft and Oracle are running security scanners,” Campara, chief executive officer of KDM Analytics, explains. “Nobody knows how good they are. Nobody tested these tools. What capabilities do these tools have? You don’t know. I don’t know. Nobody knows.”
— Dan Kaplan