NSA's mission to crack internet data intensifies call for stronger encryption standards

As more details surface about the lengths to which the National Security Agency (NSA) has gone in the professed name of stopping terrorist threats, the revelations only reinforce the need to embrace stronger encryption standards, security experts say.

On Thursday, The New York Times, The Guardian and ProPublica teamed up to reveal new findings: that the NSA and its U.K. equivalent have used their expansive resources in a years-long mission to undermine encryption methods widely used to secure communications sent over the web.

The Guardian obtained the files, which were leaked by whistleblower and former NSA contractor Edward Snowden. According to the documents, the NSA spends $250 million a year on a program called the Sigint Enabling Project, which subverts methods for securing public data.

The 50,000 pages of leaked documents also revealed that the NSA pressured major tech companies into giving the agency backdoor access to encryption software, and that, when all else failed, the NSA outright stole company encryption keys by hacking organizations' servers, a Thursday article in The Guardian said.

In addition, the documents showed that the Government Communications Headquarters (GCHQ), a British intelligence agency, has been working to decipher web traffic encrypted by Google, Facebook, Yahoo and Hotmail, the world's major service providers for email, instant messaging and other social media communications.

On Friday, Pravin Kothari, founder and CEO of cloud encryption firm CipherCloud, told SCMagazine.com that the widely accepted encryption protocol on the internet has been RSA 1024-bit encryption, which security experts have continued to warn could be broken by those with enough skill and computing power.

The NSA can monitor internet traffic by exploiting compromised encryption software, Kothari explained. "And the second problem is, the NSA can ask the provider for their decryption keys, so they decrypt it automatically," he emphasized.

In a follow-up email, Kothari added that many companies have dragged their feet in adopting more secure encryption methods for fear that it could negatively affect services.

“Experts have been advising to increase the key length to 4096-bit or longer for some time, but many internet providers are slow to upgrade due to significant performance impact,” Kothari wrote.
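The performance impact Kothari describes can be seen in a minimal sketch: the core RSA operation is a modular exponentiation, and its cost grows quickly with key size. The numbers below are arbitrary odd values of the right bit lengths, chosen only for timing; this is an illustration, not a real key exchange.

```python
import time

def modexp_time(bits, trials=10):
    """Average time for one modular exponentiation at the given modulus size."""
    base = (1 << (bits - 1)) | 3      # fixed odd base of roughly `bits` bits
    exp = (1 << (bits - 1)) | 5       # full-length exponent
    mod = (1 << bits) - 159           # odd modulus of exactly `bits` bits
    start = time.perf_counter()
    for _ in range(trials):
        pow(base, exp, mod)           # the operation RSA performs on every message
    return (time.perf_counter() - start) / trials

t1024 = modexp_time(1024)
t4096 = modexp_time(4096)
print(f"1024-bit: {t1024 * 1000:.2f} ms, 4096-bit: {t4096 * 1000:.2f} ms")
print(f"slowdown: {t4096 / t1024:.1f}x")
```

On typical hardware the 4096-bit operation is well over an order of magnitude slower than the 1024-bit one, which is the upgrade cost providers are weighing against the security gain.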

In a recent blog post, security expert and cryptographer Bruce Schneier suggested similar steps for widespread adoption of better security.

“It's pretty easy to stay a few steps ahead of the NSA by using even-longer keys,” Schneier wrote. “We're already trying to phase out 1024-bit RSA keys in favor of 2048-bit keys. Perhaps we need to jump even further ahead and consider 3072-bit keys. And maybe we should be even more paranoid about elliptic curves and use key lengths above 500 bits.”
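The reason key length matters so much is that RSA's security rests entirely on the difficulty of factoring the public modulus. A toy sketch with the standard textbook primes (p = 61, q = 53, far too small for real use) makes the relationship concrete:

```python
# Toy RSA with tiny textbook primes -- purely illustrative.
# Real deployments use 1024-bit moduli and up, as discussed above.
p, q = 61, 53
n = p * q                     # public modulus (3233)
phi = (p - 1) * (q - 1)       # Euler's totient (3120)
e = 17                        # public exponent, coprime with phi
d = pow(e, -1, phi)           # private exponent: modular inverse (Python 3.8+)

m = 65                        # a message, encoded as an integer < n
c = pow(m, e, n)              # encrypt: c = m^e mod n
assert pow(c, d, n) == m      # decrypt: c^d mod n recovers the message

# With n this small, an attacker factors it by trial division in an
# instant and can rebuild d. Longer keys make this factoring step the
# part that outruns any available computing power.
factor = next(f for f in range(2, n) if n % f == 0)
assert factor in (p, q)
```

Moving from 1024-bit to 3072-bit or 4096-bit moduli does not change this scheme at all; it only pushes the factoring step beyond what any known adversary can compute.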
