“Our delivery model is provisioned the same way that Salesforce.com's is – software that's hosted centrally in a shared environment that can be leased,” says Mark Knudsen, vice president of web solutions at ThomasNet, based in New York.
ThomasNet's online cataloging application helps its 3,000 SMB industrial users configure guided shopping selections for their business-to-business clients. Its computer-aided design (CAD) publishing component helps operators design their wares. And, its e-commerce component includes a secure shopping cart, card processing and shipping calculator.
Problem is, not all of these applications are actually ThomasNet's. Payment fulfillment, for example, is handled by a third-party cloud service provider named 3Delta Systems, which runs five cloud computing applications for secure processing and data storage in a shared environment.
This sharing between application providers represents the common cloud model: clouds within clouds connecting service providers together on their back ends. Meanwhile, on the front end, organizations are moving sensitive and regulated data into multiple cloud applications with little or no accounting of how security and access controls for that data are handled.
“Organizations I've talked to have between five and 20 in-cloud applications that they use for business,” says Chenxi Wang, principal analyst at Forrester. “This means more obscurity of data-handling features, particularly when data is crossing inter-structural boundaries between cloud service providers.”
Larry Ponemon, chairman and founder of the Ponemon Institute, dubs the creep of cloud applications the “mushroom cloud” syndrome. “Cloud computing has touched all organizations, including large enterprises that often don't know some of their divisions are using cloud applications,” he says.
As sensitive and regulated data move into the cloud, the specific security issues that follow include authentication, access controls, encryption, data leakage protections, and regulatory reporting across multiple geographic boundaries, says Bret Hartman, chief technology officer at RSA. Virtualization security also comes into play, he adds, given that many cloud providers use virtualization technologies to reallocate applications and disk space.
With RSA also being in the business of in-cloud storage and security services, Hartman and his engineers have been ruminating about how to make these components securely interoperable across the service provider cloud, and across multiple clouds chosen by the consumer in a best-of-breed fashion.
“That's the challenge of security in the cloud. It requires interoperability and some of the standards that are stable and well-established, particularly in the area of federation, and work well with the standard web interface,” Hartman says.
Federated single sign-on models based on SAML (Security Assertion Markup Language) and OpenID are emerging to address the authentication problem between users and cloud application providers.
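To make the federation model concrete, here is a minimal sketch (not a full SAML implementation, and not any vendor's code) of the checks a cloud service provider performs on an incoming SAML 2.0 assertion before granting single sign-on: is it unexpired, and was it issued for this provider? Real deployments also verify the assertion's XML signature and issuer.

```python
# Minimal sketch of service-provider-side SAML 2.0 assertion checks.
# Illustrative only: real SSO also verifies the XML signature and issuer.
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

NS = {"saml": "urn:oasis:names:tc:SAML:2.0:assertion"}

def accept_assertion(assertion_xml: str, expected_audience: str) -> bool:
    root = ET.fromstring(assertion_xml)
    conditions = root.find("saml:Conditions", NS)
    if conditions is None:
        return False
    # Reject expired assertions.
    not_on_or_after = datetime.fromisoformat(
        conditions.attrib["NotOnOrAfter"].replace("Z", "+00:00"))
    if datetime.now(timezone.utc) >= not_on_or_after:
        return False
    # Reject assertions issued for a different service provider.
    audience = conditions.find(".//saml:Audience", NS)
    return audience is not None and audience.text == expected_audience
```

The audience restriction is what keeps an assertion minted for one cloud application from being replayed against another — the property federation depends on when one identity serves many providers.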
For example, OpSource, an infrastructure services provider focusing on cloud integration, uses TriCipher's SAML-based MyOneLogin for secure access to 250 applications it hosts for its business users through an OpSource program called CONNECT.
TriCipher, through MyOneLogin and other applications, provides business users with anywhere access, VPN and multifactor authentication, and user management for 200 in-the-cloud business application providers, including Google, Salesforce.com, ADP, Concur and others.
“Authentication on a user level – we've got that one nailed down,” says Treb Ryan, CEO of OpSource, based in Santa Clara, Calif. “Locking down data transactions is the most challenging thing developers are dealing with now because the data has to travel over the public internet – and because of the nature of these mashups the data transacts with.”
By mashups, Ryan is referring to the APIs (application programming interfaces) that different developers build to tie cloud applications together and extend their service offerings. With integrated APIs, for example, users can work among Salesforce.com's CRM system, Google Spreadsheets and the Amazon Simple Storage Service while conducting their business.
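The mashup pattern Ryan describes can be sketched in a few lines: one front-end call fans out to several cloud APIs and merges the results into a single view. The three `fetch_*` functions below are stand-ins for real SOAP or REST calls to the kinds of providers named in the article; here they return canned records so the merging logic is visible.

```python
# Sketch of the back-end "mashup" pattern: one request, several clouds.
# The fetch_* functions are illustrative stand-ins, not real provider APIs.
def fetch_crm_account(account_id):        # stands in for a CRM call
    return {"account_id": account_id, "name": "Acme Industrial"}

def fetch_spreadsheet_row(account_id):    # stands in for a spreadsheet call
    return {"account_id": account_id, "credit_limit": 50000}

def fetch_stored_documents(account_id):   # stands in for a storage listing
    return {"account_id": account_id, "documents": ["contract.pdf"]}

def account_dashboard(account_id):
    """Merge records from three clouds into one view for the user."""
    merged = {}
    for fetch in (fetch_crm_account, fetch_spreadsheet_row, fetch_stored_documents):
        merged.update(fetch(account_id))
    return merged
```

The security point follows directly from the shape of the code: each `fetch_*` hop is a separate trust relationship, credential set and data-handling regime, invisible to the user looking at the merged dashboard.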
“What I see is a fairly rich cloud ecosystem with lots of providers offering something you might want or need for your applications, seamlessly, so the user on the front end isn't aware that they're pulling data out of five different systems,” says Craig Balding, a Hungary-based IT security practitioner and author of the Cloudsecurity blog.
Until now, these back-end APIs have been proprietary, resulting in inconsistent data protections across application providers and programming nightmares for integrators, says Balding.
But now, says Balding, a new protocol, OAuth (an open protocol for secure API authorization), is quickly being adopted among developers because it allows more granular control of data across cloud applications.
OAuth was formalized in December of 2007, with Google and MySpace among the first to adopt it as an API development standard in June of this year. Yahoo and others have since followed, but many large cloud service providers still have not announced support for OAuth.
“It's still in the early days, but some Google Apps customers have indeed started to use these APIs to track some changes they make to the data Google hosts on their behalf,” says Eric Sachs, product manager for Google Security. “We also believe OAuth holds great promise to enable even more secure transactions between service providers, which is why we moved from our proprietary version of delegated authorization to the standardized OAuth.”
Policy and reporting
Google, which also supports a number of SAML-based APIs for single sign-on, provisioning and email system migration, has already developed reporting APIs that can be used to monitor and track accounts, activity, disk space, email clients, quotas, summaries and suspended accounts.
Reporting on activity like this is particularly important for forensics followup and to report on regulated data, says Forrester's Wang and other analysts. Wang warns that not all service providers offer this – or any – level of reporting around sensitive data across their clouds, and that such functionality should be considered when acquiring services in the cloud.
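As a sketch of the kind of reporting check Wang recommends, the function below takes per-account activity records of the shape a cloud reporting API might return (the field names here are illustrative, not any provider's schema) and flags accounts whose download volume exceeds a policy threshold for forensic follow-up.

```python
# Sketch of a forensics triage pass over cloud activity records.
# Record fields ("account", "bytes_downloaded") are illustrative.
def flag_for_review(activity_records, max_bytes=1_000_000_000):
    """Return accounts whose downloaded volume exceeds max_bytes."""
    return sorted({rec["account"] for rec in activity_records
                   if rec["bytes_downloaded"] > max_bytes})
```

A check this simple is only possible when the provider exposes activity data in the first place — which is exactly why Wang treats reporting capability as a selection criterion, not an afterthought.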
ThomasNet, too, offers reporting capability. Users of its system get web-based dashboards with which they can access session-level reports and activity reports, and can even pull log data out for forensics investigation. But when it comes to auditability of the cloud networks themselves, experts say regulators would have a hard time enforcing audits along the service provider chain.
For now, says Ponemon, users must rely on the assurances of provider organizations, with no way to test their controls as they would for internal applications.
“In a multi-tenant environment, if there are service issues for one, there are service issues for everyone,” says Kay Dormer, a Salesforce.com spokesperson. “Our customers understand this. Very few request service level agreements because they understand that we have all our efforts focused on running a reliable service.”
The one thing that's most controllable for user organizations, says OpSource's Ryan, is in developing and enforcing policies, starting with what data should and should not be allowed into the cloud. If some of that in-cloud data is risky, then ensure that security controls are guaranteed by the primary service provider.
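Ryan's advice can be sketched as a policy gate applied before any record leaves for a cloud provider. The classification labels and the allow-list below are illustrative, not a standard; the point is that the decision is made by explicit organizational policy rather than by whichever application happens to sync data outward.

```python
# Sketch of a pre-upload policy gate, per Ryan's advice. The labels
# and allow-list are illustrative, not any standard classification.
CLOUD_ALLOWED_CLASSES = {"public", "internal"}

def may_enter_cloud(record):
    """Permit upload only for data classified as cloud-safe.

    Unclassified records default to "restricted" and are blocked.
    """
    return record.get("classification", "restricted") in CLOUD_ALLOWED_CLASSES
```

Defaulting unclassified data to "restricted" is the conservative choice: data stays on-premises until someone affirmatively decides it belongs in the cloud.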
“Cloud computing is a huge issue for operations teams today – a true megatrend,” surmises Ponemon. “The bad guys are looking for the weakest link in the chain to get at data stores in the cloud.”