Failure must be a part of the plan
In his Harvard University office, Richard Clarke is chatting about the day's headlines with another adjunct lecturer at the institution's Kennedy School of Government. The conversation seems a warm-up to an interview he has later that day with ABC News as its on-air consultant. He is to comment on a report about the mistaken release of a National Planning Scenarios document created by the Department of Homeland Security (DHS) detailing hypothetical ways terrorists might attack the country.
For now, Clarke is focusing on overall internet threats, one of which touches on the role DHS plays in helping to protect the critical infrastructure. But the most dominant of these IT security dangers afflicts all organizations – the prevalence of software vulnerabilities.
To correct these, software vendors must work together toward industry-wide standards granular enough to make an impact, says Clarke, former special advisor to President Bush for cyberspace security. "The reason people hack their way in, the reason there are worms and viruses, all comes back to mistakes in code that they are abusing," he says.
Compounding the problem is the federal government's apparent inability to take a strong leadership role by offering private companies guidance on new attacks, on reconstitution after combined virtual/physical events (called swarming), or on improved coding practices, says Clarke, who is also founder and chairman of Good Harbor Consulting.
If company systems were supported by "decent code in the first place, hackers couldn't get in," says Clarke. "Why don't we go after the cause while we're dealing with the symptoms? We need to have the best experts in the private sector, the universities and the government get together and create a set of best practices and standards for code writing."
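The article does not specify which coding mistakes Clarke has in mind, but a classic example of the kind of flaw such standards target is SQL injection, where user input is pasted into a query instead of being bound as data. The sketch below is purely illustrative (the table, function names and payload are assumptions, not from the interview):

```python
import sqlite3

def find_user_unsafe(conn, name):
    # Vulnerable: attacker-controlled `name` becomes part of the SQL itself.
    return conn.execute(
        "SELECT id FROM users WHERE name = '%s'" % name
    ).fetchall()

def find_user_safe(conn, name):
    # Safe: the driver binds `name` strictly as a value, never as SQL.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (name,)
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice'), (2, 'bob')")

payload = "x' OR '1'='1"   # classic injection string
print(len(find_user_unsafe(conn, payload)))  # 2 – matches every row
print(len(find_user_safe(conn, payload)))    # 0 – matches none
```

A coding standard that mandates parameterized queries makes this whole class of "mistakes in code that they are abusing" checkable by an auditor, which is precisely the audit-against-standards model Clarke proposes.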
Microsoft, Oracle, Sun Microsystems, Apple and other vendors, along with leading universities, the National Security Agency (NSA) and the National Institute of Standards and Technology (NIST), should develop and agree to these standards.
"Then we can [have] auditors come in and ask companies if they are really living up to those standards," he says. Based on this information, he continues, CSOs could then purchase the safest software.
But most vendors are working now to ensure security is part of their software development lifecycle, says Amit Yoran, former director of the DHS' National Cyber Security Division (NCSD), the group formed after Clarke left the President's Critical Infrastructure Protection Board (PCIPB).
"Many large software firms are quite savvy in their understanding of software vulnerabilities and are evolving their development practices to improve security. NSA and NIST have a number of activities under way to work with the development community," says Yoran.
Sun Microsystems, like Oracle and Microsoft, is working with government agencies and other institutions to find better ways to secure cyberspace. "Proactively architecting secure, flexible infrastructures based on secure planning and design, rather than responding reactively" [to a vulnerability] is the best route, says Barbara Kay, director of security marketing at Sun Microsystems.
The issue, though, is that many software-coding processes are proprietary, which moves away from establishing an agreed-upon baseline that is practiced by all vendors and then tested by auditors, says Clarke.
For example, in March, Microsoft announced new measures to design, develop and test internet-facing products to lower vulnerability counts. The Trustworthy Computing Security Development Lifecycle white paper outlines the procedures Microsoft now follows to strengthen security in code. But it has not yet released the details of these standards, says Clarke.
SystemExperts' vice-president Brad Johnson says more problems than lack of open standards plague networks. "Even if coding practices were completely addressed in some fantasy world where there were no bugs in software, it doesn't change the fact that security is hard, because integrating products from different places is extremely difficult," he says.
Even small businesses have software and hardware from numerous vendors that is layered with various applications. These multiple integration points "create some of the most significant vulnerabilities. And things don't stay the same. Keeping those things in sync is where a lot of problems lie," he adds.
To better address all these problems, companies would benefit greatly if the federal government more effectively disseminated consolidated information on exposures and attack methods, says Clarke. With many CSOs feeling that government fails to work with businesses in fruitful ways, officials must take an active leadership role that spawns ongoing and practical initiatives, he adds.
But Howard Schmidt, his former co-chairman at the PCIPB, now chief information security officer at eBay and chairman of the U.S. Computer Emergency Response Team, thinks the government is actively engaged with private businesses.
"When Dick and I ran the office at the White House, we did some 14 town hall meetings [on various cybersecurity efforts]. We were very, very public about this... Now, we've got the support... People have bought into it, so now it's down to the nuts and bolts of how to [fulfil goals set]," he says.
However, while government plays a role in helping to secure the country's critical infrastructure, the "vast majority" of the work must be done by the private sector, says Schmidt.
DHS officials did not respond to interview requests, but SC's Marcia Savage recently reported that DHS is building a national response plan and is stressing to companies the need to integrate physical security and cybersecurity.
"We're doing what we can, but cybersecurity is not just government's responsibility," insists Hun Kim, NCSD's deputy director.
Clarke agrees that private companies need to do a better job. Strong starts would include drafting and practicing comprehensive business continuity plans and following a governance structure that gives corporate leaders collective responsibility for virtual and physical security needs.
With most of the nation's critical infrastructure owned by private companies, part of the onus falls to companies' C-level executives to be more proactive about security. "The first thing that corporate boards and C-level officials have to accept is that they will be hacked, and that they are not trying to create the perfect system, because nobody has a perfect system," he says.
In the end, hackers or cyberterrorists wanting to infiltrate any system badly enough will get in, says Clarke. So businesses must accept this and design their systems for failure. This is the only sure way to stay running in a crisis. It comes down to basic risk management and business continuity practices.
"Organizations have to architect their system to be failure-tolerant and that means compartmentalizing the system so it doesn't all go down... and they have to design it in a way that it's easy to bring back up," he says.
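One common way to express the compartmentalization Clarke describes is a circuit breaker: a failing component is isolated so calls to it fail fast rather than dragging down everything around it, and recovery is a simple, explicit step. This is a hypothetical sketch of the pattern, not anything from the article; names and thresholds are illustrative:

```python
class CircuitBreaker:
    """Isolates a failing component so the rest of the system stays up."""

    def __init__(self, max_failures=3):
        self.max_failures = max_failures
        self.failures = 0

    @property
    def open(self):
        return self.failures >= self.max_failures

    def call(self, func, *args):
        if self.open:
            # Fail fast: the compartment is sealed off.
            raise RuntimeError("circuit open: component isolated")
        try:
            result = func(*args)
        except Exception:
            self.failures += 1
            raise
        self.failures = 0          # a healthy call resets the breaker
        return result

    def reset(self):
        # "Easy to bring back up": recovery is one explicit operation.
        self.failures = 0

def flaky_service():
    raise ConnectionError("backend down")

breaker = CircuitBreaker(max_failures=2)
for _ in range(2):
    try:
        breaker.call(flaky_service)
    except ConnectionError:
        pass
print(breaker.open)   # True: further calls fail fast, sparing the rest of the system
```

The design choice matches Clarke's two requirements directly: the open breaker keeps one compartment's failure from spreading, and `reset()` makes bringing the component back a deliberate, simple action.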
Relying too heavily on perimeter security and too little on additional host-based security will fall short, says Clarke. Organizations, both public and private, need to be much more proactive in protecting their networks from internal and external threats. "They spend a lot of money thinking that they can create a bullet-proof shield," he says. "So they have these very robust perimeters and nothing on the inside."
Clarke says that after deploying intrusion detection/prevention at the desktop and network levels alongside advanced anti-virus solutions, companies should also consider identity-based access controls, two- or three-factor authentication for network access both inside and out, and additional software to detect anomalous network behavior.
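In its simplest form, the anomalous-behaviour detection Clarke mentions means baselining each host's normal activity and flagging large deviations from it. The sketch below is an assumption-laden illustration (host names, counts and the three-sigma threshold are invented for the example, not drawn from any product he names):

```python
from statistics import mean, stdev

def flag_anomalies(history, current, threshold=3.0):
    """history: per-host lists of past connection counts per hour.
    current: this hour's count per host.
    Flags hosts whose current activity exceeds
    mean + threshold * stdev of their own history."""
    flagged = []
    for host, past in history.items():
        mu, sigma = mean(past), stdev(past)
        if current.get(host, 0) > mu + threshold * sigma:
            flagged.append(host)
    return flagged

history = {
    "ws-14": [20, 22, 19, 21, 20, 23],   # steady workstation traffic
    "db-02": [5, 6, 4, 5, 6, 5],         # normally quiet database host
}
current = {"ws-14": 21, "db-02": 480}    # db-02 suddenly very chatty

print(flag_anomalies(history, current))  # ['db-02']
```

Real products use far richer models, but the principle is the same: monitoring on the inside of the perimeter catches the "robust perimeter, nothing on the inside" gap Clarke warns about.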
The events of 9/11, which saw Clarke activating the entire government continuity plan for all federal departments, taught him that business continuity planning needs to go much further. "The biggest problem with business continuity is that people assume... the regular guys who do the job day-to-day in the headquarters will, in a crisis, make it to the alternate site," he says.
But on 9/11, "none of the [government] departments could get to their alternate sites. It took them 12 hours because all the roads were jammed. For continuity plans to work, you have to have a warm, if not hot, second location," he says. Additionally, full-time employees must be close by or have a guaranteed way of getting to that secondary site to re-establish business.
To achieve these and other mandates in a crisis, contingency plans need to be developed with input from the people who actually do the work, he says, leading to the creation of "lessons learned." This helps to focus attention on areas where additional training is required or where the company has failed to plan. From here, companies should establish an "action list" and timeline detailing the areas that need to be fixed, then actually correct them before a crisis hits.
Most importantly, organizations should follow a governance structure that brings together on a regular basis the major decision-makers in the company. This "corporate security council," also championed by Schmidt, must include chief financial officers, chief operating officers, HR officers, general counsels, auditors, CSOs, risk-management officials and other business leaders who will be responsible for managing the company's cyber and physical risks. The group then collectively decides on the risks that the company is willing to accept, and locates funding, resources and support for the ones they are not.
"They have to take collective responsibility for corporate security, physical and cyber," he says. "With that governance model, everything follows."