Encryption is better equipped than tokenization to secure data in the cloud.

FOR

Ian Huynh, VP of engineering, Hubspan

Think of end-to-end encryption as purchasing a comprehensive insurance policy for all aspects of your life. With that insurance, no matter what happens, you are covered and secure in the knowledge that nothing has been omitted. Likewise, with full encryption your data is completely covered – secured at the origin, as it travels to its destination, and wherever it is stored, both in motion and at rest.
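To make "secured at the origin" concrete, here is a minimal sketch of client-side authenticated encryption, assuming Python and the third-party cryptography package. It illustrates the general approach only, not Hubspan's implementation; the function names and sample record are invented for the example. Because the payload is encrypted before it leaves the sender, the same ciphertext protects it in motion and at rest in the cloud.

```python
# Minimal sketch: the client encrypts the payload before it ever leaves the
# origin, so only ciphertext travels over the wire and sits in cloud storage.
# Requires the third-party "cryptography" package.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_for_cloud(plaintext: bytes, key: bytes) -> bytes:
    """Encrypt with AES-256-GCM; the 12-byte nonce is prepended to the ciphertext."""
    nonce = os.urandom(12)
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

def decrypt_from_cloud(blob: bytes, key: bytes) -> bytes:
    """Split off the nonce, then decrypt and verify the remainder."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)  # held by the data owner, never sent to the cloud
blob = encrypt_for_cloud(b"2024-06-30,ACME,INV-1001,4200.00", key)
assert decrypt_from_cloud(blob, key) == b"2024-06-30,ACME,INV-1001,4200.00"
```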

Another data protection approach many companies use is tokenization. While it is a valid security approach, tokenization protects only the data elements you identify and tokenize, so anything overlooked remains exposed. Since data requirements change over time, full encryption gives you the assurance you need to operate your business confidently, without the fear that some data element was inadvertently omitted.

There are concerns that end-to-end encryption can hurt performance. In most cases the overhead of full encryption is acceptable, and the benefit of lower risk far outweighs any performance hit, especially when it comes to protecting critical data.

AGAINST

Ulf Mattsson, CTO, Protegrity

One of the biggest concerns about the cloud is the threat of data being stolen. Next-generation tokenization is a better option for securing data in the cloud than encryption because it is transparent, faster, more secure and more scalable.

The cloud is a high-risk environment that reduces administrators' control over where sensitive data flows. Because of that risk, encryption keys become particularly vulnerable to exposure. Tokenization eliminates the keys altogether by replacing sensitive data with random tokens, so thieves who steal the data can do nothing with it. The transparency inherent in random tokens also reduces remediation costs for the applications, databases and other components where sensitive data lives.
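As a rough illustration of the idea (a minimal sketch in Python, not Protegrity's product), the example below replaces a card number with a token drawn from a cryptographically secure random generator and keeps the mapping in a vault that never goes to the cloud. Commercial next-generation tokenization typically also preserves the format and length of the original value so applications need no changes; the simple hex token here is an assumption made to keep the sketch short.

```python
# Minimal sketch of vault-based tokenization: each sensitive value is replaced
# by a random token, and the only way back to the real value is the vault,
# which stays outside the cloud. No encryption keys exist to be exposed.
import secrets

class TokenVault:
    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        """Return the existing token for a value, or mint a new random one."""
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = secrets.token_hex(8)  # random; carries no information about the value
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        """Only callers with access to the vault can recover the original value."""
        return self._token_to_value[token]

vault = TokenVault()
record = {"customer": "Alice", "card": vault.tokenize("4111111111111111")}
# `record` can be shipped to the cloud; the real card number never leaves the vault.
```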

That said, analysts recommend that enterprises avoid home-grown tokenization solutions, which, because of the complexity involved, often take shortcuts and don't completely randomize the data. I agree with the analysts: tokenization must be truly random to be effective.
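To see why the shortcut matters, consider a hypothetical home-grown scheme that derives the "token" deterministically, for example by hashing the value. The sketch below (illustrative Python, not drawn from any vendor's product) shows that anyone can reproduce such a token by hashing candidate values, whereas a token from a cryptographically secure random generator has no relationship to the original data at all.

```python
# A deterministic hash is a shortcut, not a token: the relatively small space of
# real card numbers can be enumerated and matched against the "tokens".
import hashlib
import secrets

card = "4111111111111111"                                   # sample test card number

shortcut_token = hashlib.sha256(card.encode()).hexdigest()  # anyone can recompute this
random_token = secrets.token_hex(16)                        # recoverable only via the vault

# The attack on the shortcut: hash a guessed card number and compare.
guess = hashlib.sha256("4111111111111111".encode()).hexdigest()
print(guess == shortcut_token)  # True: the deterministic "token" leaks the card number
# No comparable check exists for random_token; it is independent of the card value.
```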
