For the last few years, companies have talked to me about their concerns and fears around cloud security, particularly public cloud models. Yet we’ve now reached a point where drug companies are running drug trials in the public cloud, the government is beginning to leverage the cloud, and providers like Amazon Web Services can demonstrate that their cloud is actually more secure than an enterprise’s individual data center. After stripping away the emotion and almost-religious arguments about where data should be, it’s clear that concerns around security are more of a perception issue than an actual problem. 

Rather than easing up, however, security conversations have raised a new concern: data sovereignty. Essentially cloud security plus politics, data sovereignty regulations require companies to keep confidential data in its country of origin, or severely restrict housing it outside that country. In most cases, these laws are designed independently by each country and tailored to its own needs, much as a nation has its own currency. Created to protect citizens, these requirements represent a huge barrier to the public cloud. Some nations are working together to overcome this obstacle – for example, the United States and the European Union established the Safe Harbor framework – but, in general, much of the world is still struggling with the issue. In Canada, for instance, some laws require that data stay not just within the country, but within specific provinces and territories. 

“We’re three to four years away from a solution that can work on a global scale…”

– Joe Coyle, VP and CTO at Capgemini U.S.

As a whole, data sovereignty has created a cloud Catch-22. Cloud is supposed to simplify the procurement of IT, cut costs and deliver flexibility. But for a company looking to leverage the public cloud on a multinational scale – which, in my experience, is most companies – costs can easily go up and flexibility down. Enterprises facing this regulatory challenge will have to embrace the hybrid cloud, find a way to work with multiple localized cloud providers, and roll up this data in a way that satisfies all of the various regulations around the world. We’re already seeing localized providers cropping up around the world to fill this void. Large providers are forced to find a local partner, acquire a local provider or build a center in the country to meet the required standards and continue operating at a global scale.

With the European Commission’s recent publication of its cloud strategy, this issue is now coming to a head – and something has to give. We need three basic things to overcome data sovereignty issues. First, we need clarity around the rules specific to each country. Companies would also be well served to establish an internal department that can dedicate the time and effort to interpret and enforce these mandates. Second, data targeted for the cloud should be architected assuming worst-case regulatory scenarios, so that organizations protect themselves. This means interpreting any ambiguity in the regulations strictly – in other words, if in doubt, keep the data within the country or regional boundaries. Then, as the rules are ironed out, organizations can hopefully consolidate and improve the architecture for both cost savings and access efficiencies. Many of the largest cloud providers have been looking at data sovereignty from the start of building out their clouds – which is exactly what we all need to be doing. 

Finally, cloud providers must use their largest assets – their customers – and work together with governments to find a solution. We’re three to four years away from a solution that can work on a global scale, but we must continue to make strides toward resolving data sovereignty issues in order to fully leverage the potential of the public cloud.