Encryption is better equipped than tokenization to secure data in the cloud.

FOR

Ian Huynh, VP of engineering, Hubspan

Think of end-to-end encryption as purchasing a comprehensive insurance policy for all aspects of your life. With this insurance, no matter what happens, you are covered and secure in knowing that nothing has been omitted. Likewise, with full encryption, your data is completely covered – secured from the point of origin all the way to its destination, both at rest and in motion.

Another data protection approach many companies use is tokenization. While still a valid security approach, tokenization could omit sensitive data that should be protected. Since data requirements change over time, full encryption gives the assurance needed to confidently operate your business without the fear that some data element was inadvertently omitted.

There are some concerns that end-to-end encryption can have a negative impact on performance. In most cases, the performance experience using full encryption is acceptable, and the benefits of lower risk far outweigh any performance hit, especially when it comes to protecting critical data.
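Mechanically, "full encryption" means the data is encrypted before it ever leaves its source and stays ciphertext until an authorized party decrypts it. The following is a minimal sketch in Python using the widely available cryptography package; the library choice and the sample record are illustrative only, not any vendor's implementation.

```python
# Minimal end-to-end encryption sketch: the record is encrypted at the
# origin and remains ciphertext at rest and in motion until the
# destination, which holds the key, decrypts it.
from cryptography.fernet import Fernet

key = Fernet.generate_key()           # shared only with the destination
cipher = Fernet(key)

record = b"customer=4111-1111-1111-1111;ssn=078-05-1120"  # hypothetical record
ciphertext = cipher.encrypt(record)   # safe to store or transmit

# ... ciphertext travels through, and rests in, the cloud ...

plaintext = cipher.decrypt(ciphertext)
assert plaintext == record
```

The trade-off Huynh acknowledges is visible here: every read and write pays an encrypt/decrypt cost, and the key itself must be managed and protected.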

AGAINST

Ulf Mattsson, CTO, Protegrity

One of the biggest concerns about the cloud is the threat of data being stolen. Next-generation tokenization is a better option for securing data in the cloud than encryption because it is transparent, faster, more secure and more scalable.

The cloud is a high-risk environment that reduces administrators' ability to control the flow of sensitive data. Because the cloud introduces this risk, encryption keys become particularly vulnerable to exposure. Tokenization eliminates keys altogether by replacing sensitive data with random tokens, so that thieves can do nothing with the data if they get it. The transparency inherent in random tokens also reduces remediation costs for applications, databases and other components where sensitive data lives.

That said, analysts recommend that enterprises avoid home-grown tokenization solutions, which, given the complexity involved, tend to take shortcuts and fail to completely randomize the data. I agree with the analysts. Tokenization must be truly random in order to be effective.
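To make the contrast concrete, here is a minimal tokenization sketch in Python. The token vault (a plain dictionary here) and the field values are illustrative only; a production system would use a hardened vault service rather than in-process storage.

```python
# Minimal tokenization sketch: sensitive values are replaced with
# random tokens; the mapping lives only in a protected vault, so a
# stolen token reveals nothing about the original value.
import secrets

vault = {}  # token -> original value (kept out of the cloud)

def tokenize(value: str) -> str:
    token = secrets.token_hex(8)   # truly random, no key to expose
    vault[token] = value
    return token

def detokenize(token: str) -> str:
    return vault[token]

card_token = tokenize("4111-1111-1111-1111")
print(card_token)               # random hex string, safe to store in the cloud
print(detokenize(card_token))   # original value, recoverable only via the vault
```

Because the tokens are generated randomly rather than derived from the data, there is no key whose compromise would expose the originals.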
