Encryption is better equipped than tokenization to secure data in the cloud.


FOR

Ian Huynh, VP of engineering, Hubspan

Think of end-to-end encryption as purchasing a comprehensive insurance policy for all aspects of your life. With this insurance, no matter what happens, you are covered and secure in knowing that nothing has been omitted. Likewise, with full encryption, your data is completely covered – secured from the origin and as it travels to its destination, both at rest and in motion.
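To make the "at rest and in motion" point concrete, here is a minimal sketch of client-side encryption in Python, assuming the third-party cryptography package is installed; the function names and record layout are illustrative, not any particular product's API:

```python
# Minimal sketch of client-side ("end-to-end") encryption, assuming the
# third-party `cryptography` package (pip install cryptography).
# The key stays with the data owner; the cloud only ever sees ciphertext.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_record(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt a record with AES-256-GCM; output = nonce || ciphertext+tag."""
    nonce = os.urandom(12)                      # unique nonce per message
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

def decrypt_record(key: bytes, blob: bytes) -> bytes:
    """Split off the nonce and authenticate-then-decrypt the payload."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)       # held by the data owner only
blob = encrypt_record(key, b"account=4111-1111-1111-1111")
assert decrypt_record(key, blob) == b"account=4111-1111-1111-1111"
```

Because the same ciphertext blob is what gets stored and what gets transmitted, the record is protected both at rest and in motion without the cloud provider ever holding the key.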

Another data protection approach many companies use is tokenization. While still a valid security approach, tokenization protects only the data elements selected for tokenization, so a sensitive field that is overlooked, or that becomes sensitive later, is left exposed. Since data requirements change over time, full encryption gives the assurance needed to confidently operate your business without the fear that some data element was inadvertently omitted.

There are concerns that end-to-end encryption can hurt performance. In most cases, performance with full encryption is acceptable, and the benefit of lower risk far outweighs any performance hit, especially when it comes to protecting critical data.

AGAINST

Ulf Mattsson, CTO, Protegrity

One of the biggest concerns about the cloud is the threat of data being stolen. Next-generation tokenization is a better option for securing data in the cloud than encryption because it is transparent, faster, more secure and more scalable.

The cloud is a high-risk environment that reduces administrators' ability to control the flow of sensitive data. Because the cloud introduces this risk, encryption keys become particularly vulnerable to exposure. Tokenization eliminates keys from the environment by replacing sensitive data with random tokens, reducing the chance that thieves can do anything with the data if they get it. And because random tokens can preserve the format of the original data, they are transparent to the applications, databases and other components where sensitive data lives, which lowers remediation costs.
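As an illustration of the mechanism (a sketch only, not Protegrity's or any vendor's implementation), the following Python fragment shows vault-based random tokenization: a truly random, format-preserving token is drawn from a CSPRNG, and the token-to-value mapping is kept in a vault that never leaves the data owner's control. The TokenVault class and its methods are hypothetical names invented for this sketch:

```python
# Minimal sketch of vault-style random tokenization (illustrative only).
# Sensitive values are swapped for random tokens drawn from a CSPRNG; the
# token-to-value map lives in a vault kept outside the cloud environment.
import secrets

class TokenVault:
    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        """Return the existing token for value, or mint a truly random one."""
        if value in self._value_to_token:
            return self._value_to_token[value]
        # Format-preserving token: same length, digits only, so downstream
        # systems that expect a card-number-shaped field need no changes.
        while True:
            token = "".join(secrets.choice("0123456789") for _ in value)
            if token not in self._token_to_value:   # avoid collisions
                break
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        """Look the original value back up; only the vault can do this."""
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")   # the cloud sees only this token
assert vault.detokenize(token) == "4111111111111111"
```

Note that there is no key and no mathematical relationship between token and value: without access to the vault, a stolen token is just random digits.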

That said, analysts recommend that enterprises avoid home-grown tokenization solutions, which, because of the complexity involved, often take shortcuts and do not completely randomize the data. I agree with the analysts: tokenization must be truly random in order to be effective.
