Tesla had one. Robocent had one. Walmart had one. GoDaddy had one. Misconfigured servers and databases in the cloud – exposing critical information – are trending on the internet. In fact, they’re all the rage. At first we “oooo” and “aaah” at the criticality of the information exposed and shudder at the potential consequences had it fallen into the wrong hands. In Tesla’s case, trade secrets were exposed. For Robocent, it was voter data. GoDaddy’s cloud configuration information was revealed for all to see. Details on 1.3 million customers of a Walmart jewelry partner were left wide open.

Then comes the inevitable chastisement of the companies involved – how can this happen yet again? One would think that after security researchers uncovered a handful of open AWS S3 buckets or Microsoft Azure and Google Cloud databases, organizations would wise up and “batten down the hatches,” as Cloud Daddy founder and former NYC Law Department CIO Joe Merces advises.

The importance – and potential for compromise – of the information exposed to the public is breathtaking. Staggering, really. And that most incidents haven’t resulted in a damaging breach or criminal action is a stroke of luck and a nod to the abundance of illegal ventures that preoccupy cybercriminals these days.

The security soft spot doesn’t lie in the cloud infrastructure itself, the experts say.

“I think the persistent problem is not because major cloud providers are inherently insecure,” Rich Campagna, CEO at Bitglass, maintains.

Nor are the bulk of reported exposures the result of malicious intent. Most, in fact, are the result of human error or perhaps straight-up ignorance.

“If you provide a capability to a customer that they can make a mistake, they probably will,” Baffle Co-founder and CEO Ameesh Divatia says, noting that “at the end of the day” everyone makes mistakes. The sheer number of people who “touch” data in the cloud increases the likelihood of exposure.

“Amazon goes to great lengths to ensure that S3 buckets are encrypted by default. In spite of that, configuration errors lead to situations where the data is exposed,” he explains.
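Divatia’s point is that default encryption at rest does nothing if the bucket’s access policy itself is open. A minimal sketch of that kind of check, assuming grants shaped like what boto3’s `get_bucket_acl` returns (the sample ACL below is hypothetical):

```python
# Sketch: flag S3 ACL grants that expose a bucket to everyone.
# In practice the grant list would come from boto3:
#   grants = boto3.client("s3").get_bucket_acl(Bucket=name)["Grants"]
# Here a hypothetical, hard-coded ACL is evaluated instead.

PUBLIC_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def public_permissions(grants):
    """Return the permissions granted to the public (or to all AWS users)."""
    exposed = []
    for grant in grants:
        grantee = grant.get("Grantee", {})
        if grantee.get("Type") == "Group" and grantee.get("URI") in PUBLIC_GRANTEES:
            exposed.append(grant["Permission"])
    return exposed

# Hypothetical ACL: the owner has full control, but a stray grant
# lets anyone on the internet read the bucket's contents.
sample_grants = [
    {"Grantee": {"Type": "CanonicalUser", "ID": "owner"},
     "Permission": "FULL_CONTROL"},
    {"Grantee": {"Type": "Group",
                 "URI": "http://acs.amazonaws.com/groups/global/AllUsers"},
     "Permission": "READ"},
]

print(public_permissions(sample_grants))  # → ['READ']
```

Even a check this simple, run periodically against every bucket, would catch the kind of misstep behind many of the exposures described here.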

And Merces says the persistence of open buckets is greater than reported.

“The problem is systemic, not for just large organizations,” he says. “If the big guys do it, what’s happening with smaller organizations? The exact same thing!”

Everything changed

While tales of open S3 buckets are more prevalent, or at least more prominent, the problem of open buckets and misconfigured servers is pervasive across all platforms and can be attributed to sweeping changes in development and operational environments.

A once dim view of the cloud as insecure and risky – less than a decade ago – has given way to, if not a full-on embrace, then at least an acceptance that the cloud is necessary. Craving the flexibility and reach the cloud gives them to touch customers, share information and roll out services more quickly, once-leery organizations have rushed to the cloud at a surprising pace. In fact, about 93 percent of U.S. businesses rely on cloud computing, with more than three million data centers operating nationwide to deliver cloud services, the Information Technology & Innovation Foundation (ITIF) says.

As a result, the pressure has landed squarely on developers to turn around apps and services more quickly at the same time that another notable change – from private to public cloud – has shifted the security equation.

“They gave the keys to the kingdom to developers,” who previously were accustomed to working in closed, controlled environments “under the watchful eye of IT,” Campagna says. While their intentions aren’t malicious when they inadvertently leave a bucket open or misconfigure a server, they often don’t have security on their minds.

“The development staff hasn’t learned what IT knows about security infrastructure,” says Divvy Cloud CEO Brian Johnson.

The trials and tribulations of cloud security are legion.

“When you migrate to the cloud, woes and security challenges more than double, and not just because you’re running a private data center, too,” says Merces. “You have more to do with the added challenge of battening down the hatches without killing innovation.”

When everything was in the data center, development might have been more deliberate and plodding, but security was easier, or at least more straightforward – invest in the right tools and, voilà, data and applications were locked down tight.

“Oftentimes these exposures are the result of developers and the test environment,” says Campagna. “Before, it didn’t matter because it was in the data center. That’s changed a lot. Developers need real data to test.”

And if they’re not well schooled on cloud security (hint: many aren’t), they can leave a database open or a server misconfigured and not realize it until something untoward happens or an intrepid security researcher stumbles across it.

“Some don’t know if the data is public to their companies or public to the internet,” says Merces.

Further complicating matters, organizations typically use more than one cloud solution. In fact, IDC predicts that by 2021, more than nine of 10 companies will use multiple cloud services and platforms.

“Organizations use multiple clouds because they don’t want to put all their eggs in one basket,” says Merces. “But the learning curve is higher.”

AWS, Microsoft Azure, Google Cloud and others “are all different,” he says, making the environment a “more complex, very complicated problem” at a time when “attack vectors for bad guys out there in the cloud are increasing” with potentially devastating results.

Consider the great damage that a successful phishing email can do, Merces points out. “Just think if a user [victim] has unfettered access to the cloud.”

What to do, what to do

Cloud providers have invested quite a bit in securing the cloud itself. Preventing open and misconfigured servers, however, falls to the customer organization or enterprise.

“Amazon’s shared responsibility model is very clear to point out that the protection of their customer’s data is the data owner’s responsibility,” agrees Divatia. “Amazon is responsible for the security ‘of’ the cloud but the customer is responsible for security ‘in’ the cloud.”

Merces concurs that “AWS has made it clear they’re responsible for security of the cloud hardware and underlying hardware resources.”

The customer is “responsible for what you put in the cloud and all the patches and updates.” It’s up to organizations to protect data and manage access to the cloud, he says. “If you fail to do that, it’s on you, not them.”

To minimize the risk of exposure, protecting the data is the best place to start, the experts agree.

“Amazon continues to develop tools that can warn its users if data is left unprotected,” says Divatia. “While these measures are necessary, a better solution is to take control of your sensitive data as soon as it is created.”

Encryption is a centerpiece of that posture. “The fact that data is not encrypted is cause for a lot of the problems,” he says, noting that newer approaches “focus on the security of the data itself by encrypting it prior to the migration to the cloud,” guaranteeing “that the owner of the data maintains control even though they do not control the infrastructure.”

If an organization encrypts at the record level, “you don’t have to worry – then it’s OK to leave the front door open,” says Divatia.
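The posture Divatia describes – encrypting each record before it ever reaches the cloud, with the key never leaving the owner – can be sketched briefly. This assumes the third-party `cryptography` package is installed; Fernet here is a stand-in for whatever cipher and key-management scheme an organization actually uses, and the upload call is illustrative only:

```python
# Sketch: record-level encryption before migration to the cloud.
# Assumes the third-party `cryptography` package is installed
# (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # the data owner keeps this key; it never leaves
cipher = Fernet(key)

record = b'{"customer_id": 1234, "email": "jane@example.com"}'
ciphertext = cipher.encrypt(record)

# Only the ciphertext would be uploaded, e.g. (illustrative call):
#   s3.put_object(Bucket="...", Key="records/1234", Body=ciphertext)
# Even if the bucket is later left open, the record is unreadable
# without the key the owner retained.
assert cipher.decrypt(ciphertext) == record
```

The point of the sketch is the ordering: encryption happens before the data touches infrastructure the owner doesn’t control, so a misconfigured bucket leaks only ciphertext.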

Encryption has met with resistance, mostly because organizations and the people who work there fear it. “We hear the word encryption; it implies control,” he says. “There’s a fear of losing the key and not getting data back.”

But the benefits of encryption extend beyond security. “It’s good for privacy compliance too – if the data is encrypted, you don’t have to report to Privacy Shield,” says Divatia.

Train and educate

As with anything security related, preventing data exposure in the cloud depends on education and training. “At the end of the day, everyone makes mistakes,” says Divatia.

So, it “boils down to properly training people,” says Merces, who notes that developers and others “don’t ask proper questions” about security because they don’t know what to ask.

Building in safeguards such as alerts that warn that a bucket is exposed can keep developers on a more secure path without putting the kibosh on innovation or messing up time to market. They “get an alert and shut down,” says Campagna. “It doesn’t have to slow them down.”
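The guardrail Campagna describes can be as simple as evaluating a bucket’s public-access settings and raising an alert when any safeguard is switched off. A sketch, assuming settings shaped like S3’s PublicAccessBlockConfiguration (in practice they would come from boto3’s `get_public_access_block`; the sample values below are hypothetical):

```python
# Sketch: alert when any S3 "Block Public Access" safeguard is disabled.
# In practice the settings dict would come from:
#   boto3.client("s3").get_public_access_block(Bucket=name)
#       ["PublicAccessBlockConfiguration"]

REQUIRED_SAFEGUARDS = (
    "BlockPublicAcls",
    "IgnorePublicAcls",
    "BlockPublicPolicy",
    "RestrictPublicBuckets",
)

def misconfigured(settings):
    """Return the safeguards that are missing or switched off."""
    return [name for name in REQUIRED_SAFEGUARDS if not settings.get(name, False)]

# Hypothetical bucket where one safeguard was left off.
settings = {
    "BlockPublicAcls": True,
    "IgnorePublicAcls": True,
    "BlockPublicPolicy": False,
    "RestrictPublicBuckets": True,
}

problems = misconfigured(settings)
if problems:
    print(f"ALERT: public-access safeguards disabled: {problems}")
```

Wired into a CI pipeline or a scheduled job, a check like this surfaces the mistake at the moment it is made, rather than when a researcher – or an attacker – finds it.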

Johnson favors letting IT take the wheel and provide a watchful eye over development in the cloud. “Bring IT back into the mix,” he says.

Checking third-party security is critical as well. Many of the incidents involving open databases or misconfigured servers occurred on third-party sites, a painful reminder that organizations must do due diligence on the security of their partners.

Just ask Tesla, whose sensitive information – along with that of VW and other manufacturers – was exposed after industrial automation provider Level One Robotics inadequately secured an rsync file transfer server. “Who knows how long the information was up there?” asks Merces, who also recommends organizations conduct pen tests and risk assessments.

“Have a third party do a risk assessment,” he says, though he acknowledges that “a lot of times, teams inside haven’t jumped on the bandwagon with security risk assessment.”

They might consider hopping on soon. The earlier in the process cloud security begins, the less likely an organization will fall victim to the latest fad – misconfigured servers – and find its valuable information exposed for the world to see.