Encryption is better equipped than tokenization to secure data in the cloud.

FOR

Ian Huynh, VP of engineering, Hubspan

Think of end-to-end encryption as purchasing a comprehensive insurance policy for all aspects of your life. With this insurance, no matter what happens, you are covered and secure in knowing that nothing has been omitted. Likewise, with full encryption, your data is completely covered: secured at its origin and as it travels to its destination, both at rest and in motion.
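
As a rough sketch of what protecting data "at rest and in motion" can look like in practice (the column names no specific tools, so this example assumes Python and AES-GCM from the cryptography package):

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # Generate a 256-bit data-encryption key; in production this would
    # come from a key-management service, not live alongside the data.
    key = AESGCM.generate_key(bit_length=256)
    aesgcm = AESGCM(key)

    record = b'{"card": "4111111111111111", "name": "A. Customer"}'

    # Encrypt before the record is written to storage or sent over the wire.
    nonce = os.urandom(12)  # must be unique for every encryption with this key
    ciphertext = aesgcm.encrypt(nonce, record, None)

    # Only a holder of the key can recover the plaintext.
    plaintext = aesgcm.decrypt(nonce, ciphertext, None)
    assert plaintext == record

The same ciphertext can sit in cloud storage or move between services; without the key, it is unreadable in either state.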

Another data protection approach many companies use is tokenization. While it is still a valid security approach, tokenization protects only the data elements that are explicitly tokenized, so sensitive data can be overlooked. Because data requirements change over time, full encryption gives you the assurance to operate your business confidently, without the fear that some data element was inadvertently left exposed.

There are concerns that end-to-end encryption can hurt performance. In most cases, performance with full encryption is acceptable, and the benefit of lower risk far outweighs any performance hit, especially when it comes to protecting critical data.

AGAINST

Ulf Mattsson, CTO, Protegrity

One of the biggest concerns about the cloud is the threat of data being stolen. Next-generation tokenization is a better option than encryption for securing data in the cloud because it is transparent, faster, more secure and more scalable.

The cloud is a high-risk environment that reduces administrators' ability to control the flow of sensitive data. Because the cloud introduces this risk, encryption keys become particularly vulnerable to exposure. Tokenization eliminates keys altogether by replacing sensitive data with random tokens, so thieves can do nothing with the data even if they steal it. The transparency inherent in random tokens also reduces remediation costs for the applications, databases and other components where sensitive data lives.
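
A minimal sketch of that random-token idea, assuming Python and an in-memory vault standing in for a real token server, which would live in a hardened environment rather than in the cloud application:

    import secrets

    # The vault maps tokens back to real values and is the only place
    # the sensitive data exists; the cloud side sees tokens only.
    vault = {}

    def tokenize(value: str) -> str:
        # The token comes from a CSPRNG, so it has no mathematical
        # relationship to the original value and there is no key to steal.
        token = secrets.token_hex(16)
        vault[token] = value
        return token

    def detokenize(token: str) -> str:
        return vault[token]

    card = "4111111111111111"
    token = tokenize(card)

    assert token != card                 # the cloud stores only the token
    assert detokenize(token) == card     # authorized systems can look it up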

That said, analysts recommend that enterprises avoid home-grown tokenization solutions, which, given the complexity involved, tend to take shortcuts and fail to completely randomize the data. I agree with the analysts: tokenization must be truly random in order to be effective.
