Encryption is better equipped than tokenization to secure data in the cloud.


FOR

Ian Huynh, VP of engineering, Hubspan

Think of end-to-end encryption as purchasing a comprehensive insurance policy for all aspects of your life. With this insurance, no matter what happens, you are covered, secure in the knowledge that nothing has been omitted. Likewise, with full encryption your data is completely covered – secured at the origin and as it travels to its destination, both at rest and in motion.

Another data protection approach many companies use is tokenization. While still a valid security approach, tokenization protects only the data elements you choose to tokenize, so sensitive data that should be protected can be overlooked. Since data requirements change over time, full encryption gives the assurance needed to confidently operate your business without the fear that some data element was inadvertently left exposed.

There are some concerns that end-to-end encryption can have a negative impact on performance. In most cases, the performance experience using full encryption is acceptable, and the benefits of lower risk far outweigh any performance hit, especially when it comes to protecting critical data.

AGAINST

Ulf Mattsson, CTO, Protegrity

One of the biggest concerns about the cloud is the threat of data being stolen. Next-generation tokenization is a better option for securing data in the cloud than encryption because it is transparent, faster, more secure and more scalable.

The cloud is a high-risk environment that decreases administrators' ability to control the flow of sensitive data. Because the cloud introduces this risk, encryption keys become particularly vulnerable to exposure. Tokenization eliminates keys altogether by replacing sensitive data with random tokens, so that thieves can do nothing with the data if they get it. The transparency inherent in random tokens also reduces remediation costs for the applications, databases and other components where sensitive data lives.
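To make the contrast with encryption concrete, the core idea can be sketched in a few lines. This is a hypothetical, minimal in-memory illustration (the class name `TokenVault` and its methods are my own, not from any vendor's product); a real system would keep the mapping in a hardened, access-controlled vault rather than a Python dict:

```python
import secrets


class TokenVault:
    """Minimal sketch of vault-based tokenization.

    A random token stands in for the sensitive value. Unlike ciphertext,
    the token has no mathematical relationship to the original, so there
    is no encryption key that could be stolen and used to reverse it.
    """

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Return the existing token so a value always maps consistently.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # secrets.token_hex produces a cryptographically random token.
        token = secrets.token_hex(16)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only a party with access to the vault can recover the original.
        return self._token_to_value[token]


vault = TokenVault()
card = "4111111111111111"
token = vault.tokenize(card)
assert token != card                    # the token reveals nothing by itself
assert vault.detokenize(token) == card  # only the vault can reverse it
```

Because the token is format-free random data with no key behind it, a stolen token (or a stolen database of tokens) is worthless without access to the vault itself – which is the property the column is arguing for.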

That said, analysts recommend that enterprises avoid home-grown tokenization solutions, which, because of the complexity involved, often take shortcuts and don't completely randomize the data. I agree with the analysts. Tokenization must be truly random in order to be effective.
