Can security be better in the cloud? Experts believe it can, reports Deb Radcliff.
Cloud computing represents a technology shift that comes once every 20 years, says Jim Reavis, well-known security consultant and co-founder of the nonprofit Cloud Security Alliance (CSA). This shift presents an opportunity to do security better. It also presents the possibility of making matters much worse by pushing vulnerable applications and data into the hostile web.
“We’re talking about computing becoming a utility, which is the biggest change the internet has brought us to date,” Reavis says. “It’s not often that we shift computing to an entirely new platform. If we develop our infrastructures right, I believe enterprises will actually have more control in the cloud.”
Because of their use of virtualization, clouds offer the ability to do security on a massive scale and reduce costs, explains Nirav Mehta, senior manager at RSA, the security division of EMC.
However, cloud computing also introduces new risks to applications, data and the internal data center, which analysts predict will increasingly connect to its cloud components for synchronization.
There’s been a lot of confusion about what cloud computing actually is. Most of us are familiar with the public cloud, which Forrester Research defines as “pay-per-use hosting of virtual servers at service providers,” such as those offered by Amazon and Google. The second type is the internal cloud, which enterprises are building with home-grown connectors and ‘cloud-in-a-box’ virtual network appliances, such as Eucalyptus Enterprise, IBM’s CloudBurst and Microsoft’s Azure. The third type is the hybrid environment, in which an enterprise distributes computing resources between public and private clouds to support load balancing and specific tasks or functions. In this model, says Kristin Lovejoy, VP of security strategy for IBM, organizations keep sensitive information within their private clouds.
At the end of 2009, 82 percent of enterprise respondents to a survey from networking vendor F5 Networks were in some stage of public cloud trial or implementation, and 45 percent were already using internal clouds. These statistics support what Lovejoy says is widespread adoption of a hybrid model in which enterprises turn their infrastructure over to a service provider to support their software-as-a-service (SaaS) applications.
“Think of this as phase three of data center transformation, where organizations have gone beyond consolidation of platforms to virtualized systems to starting to manage these assets in a cyber cloud,” she explains. “Now I can roll out my tools and resources in the cloud as needed for a period of time, assign them to a specific person, then eliminate them when no longer needed.”
Risks and controls
At the time of this writing, the Cloud Security Alliance was near release of its version 2.1 cloud security guidelines, which define secure migration paths to private, public and hybrid clouds, as well as software- and infrastructure-as-a-service models. Of all the issues in the 80-page document, the most pressing are access control and identity management, say Reavis and others.
Access and identity management in the cloud encompasses everything from federation and role assignment to multifactor authentication at the entry point, something most web-facing businesses still do not use because of scalability and user-comfort concerns. Cloud storage provider Egnyte, for example, grants access to its network based on username and password, a combination attackers have already targeted with brute-force methods against Amazon, Google and other service providers, according to multiple reports. Egnyte CEO and co-founder Vineet Jain says moving to multifactor authentication is not possible at this time for those reasons.
“Username and password is not the safest way of accessing your corporate data, so you have to tie this level of access to open standards, including SAML and OpenID, to handle credentials securely,” he says. “You also assign access APIs down to the folder level that define basic permissions of read/write/delete — rather than setting the 19 permissions allowed in the Microsoft file server.”
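The folder-level model Jain describes can be sketched roughly as follows. All class and method names here are hypothetical, not Egnyte's actual API, and the inheritance of permissions from parent folders is an assumption made for illustration:

```python
# Hypothetical sketch of folder-level access control with three basic
# permissions (read/write/delete), in place of fine-grained file-server ACLs.
from enum import Flag, auto

class Perm(Flag):
    NONE = 0
    READ = auto()
    WRITE = auto()
    DELETE = auto()

class FolderACL:
    """Maps (user, folder) to a small permission set. Permissions granted
    on a parent folder apply to everything beneath it unless overridden."""
    def __init__(self):
        self._grants = {}  # (user, folder_path) -> Perm

    def grant(self, user, folder, perm):
        self._grants[(user, folder)] = perm

    def check(self, user, path, perm):
        # Walk from the most specific path component up toward the root,
        # honoring the closest explicit grant found.
        parts = path.strip("/").split("/")
        for i in range(len(parts), 0, -1):
            folder = "/" + "/".join(parts[:i])
            granted = self._grants.get((user, folder))
            if granted is not None:
                return perm in granted
        return False  # no grant anywhere on the path

acl = FolderACL()
acl.grant("alice", "/finance", Perm.READ | Perm.WRITE)
acl.grant("bob", "/finance", Perm.READ)
print(acl.check("alice", "/finance/q3/report.xls", Perm.WRITE))  # True
print(acl.check("bob", "/finance/q3/report.xls", Perm.WRITE))    # False
```

Keeping the permission vocabulary this small is the point of Jain's comment: three verbs per folder are far easier to reason about, and to audit, than 19 per file.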
When defining access controls for cloud computing, organizations must consider that their employees will be logging into the cloud as well as into internal systems for resources. They must also define controls that do not force users to log in and out repeatedly, advises Lovejoy. This means either integrating cloud access with internal directories or replicating those directories out to the cloud, both of which expose these mission-critical directory services to new risk.
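The federation option can be sketched as follows: the internal directory vouches for an already-authenticated user with a signed assertion that the cloud service verifies, so there is no second login. The shared-secret HMAC and token format below are purely illustrative; real deployments use SAML or OpenID with public-key signatures, as the standards referenced above provide.

```python
# Toy single-sign-on sketch: internal directory (IdP) signs a claim,
# the cloud service checks the signature instead of re-authenticating.
import base64
import hashlib
import hmac
import json

SHARED_SECRET = b"demo-secret"  # hypothetical key exchanged out of band

def issue_assertion(user):
    """Internal directory signs a claim about a logged-in user."""
    payload = base64.urlsafe_b64encode(json.dumps({"sub": user}).encode()).decode()
    sig = hmac.new(SHARED_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig

def cloud_accepts(assertion):
    """Cloud service verifies the signature; no second password prompt."""
    payload, sig = assertion.rsplit(".", 1)
    expected = hmac.new(SHARED_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # forged or tampered assertion
    return json.loads(base64.urlsafe_b64decode(payload))["sub"]

token = issue_assertion("alice")
print(cloud_accepts(token))        # alice
print(cloud_accepts(token + "0"))  # None (signature no longer matches)
```

The design choice this illustrates is the trade-off Lovejoy raises: the directory never leaves the organization, but the cloud service must now be trusted to verify assertions correctly.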
According to Verizon’s 2009 breach report, 87 percent of breaches originated with applications, while multiple other reports show that web applications have become the top vector of attack.
“Imagine software written for internal use with all the trust mechanisms for users and putting that on the web with a web interface,” says George Hess, leader of the German chapter of OWASP and CEO and co-founder of the Art of Defence, a web application security vendor.
These apps have run internally with no testing for input validation flaws, cross-site scripting or SQL injection, attacks that are happening all over the web today, he continues. So internal apps need to be rigorously assessed before being moved into the cloud, then securely managed and maintained once they are there.
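The gap Hess describes is easy to demonstrate. The sketch below, using Python's sqlite3 with a hypothetical users table, contrasts string-built SQL, the pattern common in trusted internal apps, with a parameterized query:

```python
# Demonstrates why string-built SQL fails on the open web, and the
# parameterized fix. Table and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

attacker_input = "nobody' OR '1'='1"

# Vulnerable: the input is concatenated into the SQL text, so the
# injected OR clause matches every row.
unsafe = conn.execute(
    "SELECT name FROM users WHERE name = '%s'" % attacker_input
).fetchall()
print(unsafe)   # [('alice',)] -- the filter was bypassed

# Safe: a placeholder keeps the input as data, never as SQL syntax.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (attacker_input,)
).fetchall()
print(safe)     # [] -- no user is literally named that
```

An internal app could run the vulnerable version for years without incident; only exposure to hostile input makes the difference visible, which is why assessment must precede the move to the cloud.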
Data protection, particularly the use of encryption, is another critical control area that experts say must follow data as it moves from the organization through the cloud and back into the organization.
Like multifactor authentication, encryption is notoriously difficult to manage in-house because of its reliance on keys. Intermediaries such as key authorities have been managing these functions in the cloud for a long time, offering a desirable extension into what Mehta calls security clouds, which will link customers to their software-on-demand providers.
“In my mind, there’s a very loose connection between where keys are stored in the cloud, encrypted data is stored in another cloud, and the customer is the connection between the two clouds,” he explains.
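Mehta's loose coupling can be modeled roughly as follows. The two in-memory dicts stand in for separate key-management and storage clouds, and the XOR routine is a deliberately insecure placeholder for real encryption such as AES; only the structure, not the cipher, is the point:

```python
# Toy model of split custody: keys live in one service, ciphertext in
# another, and only the customer combines them.
import os

key_cloud = {}    # key-management service: object_id -> key bytes
data_cloud = {}   # storage service:        object_id -> ciphertext

def xor(data, key):
    # Placeholder cipher -- NOT secure, shown for structure only.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def store(object_id, plaintext):
    key = os.urandom(32)
    key_cloud[object_id] = key                    # key goes to one cloud...
    data_cloud[object_id] = xor(plaintext, key)   # ...ciphertext to another

def retrieve(object_id):
    # Only a client with access to BOTH clouds recovers the plaintext;
    # either provider alone holds nothing usable.
    return xor(data_cloud[object_id], key_cloud[object_id])

store("doc-1", b"quarterly results")
print(retrieve("doc-1"))  # b'quarterly results'
```

Compromise of the storage provider yields only ciphertext, and compromise of the key authority yields keys with nothing to unlock, which is the control the customer-in-the-middle design is meant to buy.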
Visibility is another critical control organizations need to consider when moving to cloud computing, says Bart Vansevenant, director of strategy and service creation for Verizon Business Global Services.
“Users of cloud services want transparency of the policies implemented in all areas of control. They want actual access to logs and events directly,” he says. “They want to be able to provision this server, their applications, databases and specific sets or subsets of security controls and access rules.”
In the most secure implementation of cloud computing, Vansevenant says customers use their multiprotocol label switching (MPLS) connections into the Verizon clouds to take advantage of on-demand computing resources. Small organizations with fewer resources, he adds, are the primary consumers of the company’s public cloud interface.
By turning the networking model inside out, cloud computing appears to represent the demise of the internal network, as the Jericho Forum and others have predicted for several years now. Fortunately, there is a large body of experience, guidance and even available standards to draw on, letting the industry move with this shift in computing rather than chase after it.
“The cloud has become the next development platform,” says CSA’s Reavis. “There are real challenges ahead, but if we can get out in front of this, we have the opportunity of creating standards-based flexible frameworks that can be secure.”