For years, the National Institute of Standards and Technology has been working on a project to identify and vet a handful of new encryption algorithms that can help protect federal computers and systems from hacking threats powered by quantum computing.
On Tuesday, the agency announced four new algorithms that will underpin its future cryptography standards by 2024. They include one algorithm for general encryption (CRYSTALS-Kyber) and three more for digital signatures and identity verification (CRYSTALS-Dilithium, Falcon and Sphincs+).
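The split reflects the two interfaces the algorithms expose. Kyber is a key-encapsulation mechanism (KEM): one party uses a public key to produce a ciphertext plus a shared secret, and the key holder recovers the same secret. The sketch below is not Kyber's lattice math, just a toy stand-in built on stdlib hashing to show the keygen/encapsulate/decapsulate shape a KEM presents; every primitive here is an illustrative assumption, and it is insecure by design.

```python
# Toy KEM interface sketch — NOT CRYSTALS-Kyber, and not secure.
# Real KEMs replace the hashing below with hard lattice problems; only the
# three-function shape (keygen / encapsulate / decapsulate) is the point.
import hashlib
import secrets

def keygen():
    """Produce a public/secret key pair (toy: pk is just a hash of sk)."""
    sk = secrets.token_bytes(32)
    pk = hashlib.sha256(sk).digest()
    return pk, sk

def encapsulate(pk):
    """Sender's side: derive a ciphertext to transmit plus a shared secret."""
    ct = secrets.token_bytes(32)           # sent over the wire
    ss = hashlib.sha256(pk + ct).digest()  # kept locally as the shared secret
    return ct, ss

def decapsulate(sk, ct):
    """Receiver's side: recover the same shared secret from the ciphertext."""
    pk = hashlib.sha256(sk).digest()
    return hashlib.sha256(pk + ct).digest()
```

Both parties end up with the same secret, which then keys a fast symmetric cipher; the signature algorithms (Dilithium, Falcon, Sphincs+) solve the separate problem of proving who sent a message.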
NIST mathematician and project lead Dustin Moody told SC Media that at this stage, all the finalists had met baseline standards and the choice came down to small but measurable differences in things like speed and ease of use.
“Security was our No. 1 criteria … but for all of the finalists we had enough confidence in that regard, so our second criteria was performance: looking at key size, signature size, how much memory is involved when you implement it, looking at benchmarks, how fast you implement it on a variety of platforms,” he said in an interview.
Three of the selections (CRYSTALS-Kyber, CRYSTALS-Dilithium and Falcon) are lattice-based algorithms. The NSA has already said that it also intends to select a lattice-based solution for its own next-generation encryption.
CRYSTALS-Kyber beat out two other similar candidates for general encryption, largely because it had slightly stronger documentation.
“In our view [Kyber] was the strongest, technically, of the three when we considered security and performance and its implementation numbers,” Moody said in an interview. “The other two were not too far behind it but we … felt that Kyber had a little bit more of an advantage.”
CRYSTALS-Dilithium and Falcon are also lattice-based, but NIST expects most organizations to use Dilithium because it performs well, has strong documentation and is “also a lot simpler to implement than Falcon.” While Falcon will require a complex implementation and may not work on all devices, it’s also smaller and there are certain use cases for applications that use smaller digital signatures, so NIST decided to include both.
The fourth selection, Sphincs+, was determined to be the strongest non-lattice-based solution for digital signatures, in line with the agency’s long-held belief that it will need to develop backup options in case future weaknesses are discovered in any one post-quantum cryptographic approach.
“We wanted to ensure that we had another algorithm in case someone discovers a breakthrough and there’s some attack on lattices … we want to have an algorithm based on another type of [cryptographic] family,” he said. “It has really good security analysis, it’s a little bit larger and slower, so we don’t think it will be used quite as much but we wanted to be ready.”
It's also why NIST and others have consistently promoted the concept of "crypto-agility," or building encryption protocols that can switch out different algorithms with as little impact on performance and reliability as possible. While many experts believe the algorithms that have made it to this stage have proven they can defend against attacks from a cryptographically relevant quantum computer, the fact that such a thing does not currently exist means there are some assumptions built in.
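In code, crypto-agility usually means that messages carry an algorithm identifier and implementations dispatch through a registry, rather than hard-coding one primitive. The sketch below illustrates only that pattern; the HMAC variants are stand-ins for interchangeable algorithms (the registry names and wire format are invented for this example), not any NIST-specified scheme.

```python
# Crypto-agility pattern sketch: the registry and the algorithm tag on the
# wire are the point — swapping entries should not change calling code.
# HMAC flavors stand in for real interchangeable algorithms.
import hashlib
import hmac

REGISTRY = {
    "hmac-sha256": lambda key, msg: hmac.new(key, msg, hashlib.sha256).digest(),
    "hmac-sha3-256": lambda key, msg: hmac.new(key, msg, hashlib.sha3_256).digest(),
}

def protect(alg, key, msg):
    # Prefix the algorithm name so peers can migrate independently.
    return alg.encode() + b"|" + REGISTRY[alg](key, msg)

def check(key, msg, wire):
    alg, _, tag = wire.partition(b"|")
    # Look up whatever algorithm the sender used; constant-time compare.
    return hmac.compare_digest(REGISTRY[alg.decode()](key, msg), tag)
```

If an algorithm in the registry is later broken, it can be removed and a replacement added without touching `protect` or `check` callers — which is the property NIST is urging protocol designers to preserve as post-quantum algorithms roll out.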
"It is currently not clear if they can be broken, but it is clear that after looking very carefully we did not find a trivial way," said Vincent Berk, chief strategy officer for Quantum Xchange, a company that sells encryption and secure communications tools based on NIST finalist algorithms.
While mathematicians and cryptographers have done all the due diligence they can, quantum computers are meant to solve problems far more complex than humans can tackle on their own, and thus it is wise not to base the safety of the world's data on any one approach that could represent a single point of failure.
"It took over 350 years to crack Fermat's Last Theorem. These are the timescales at which we solve hard math problems. It is very possible that a quantum, or even a conventional binary, computer will be able to break newer, as well as existing crypto technology," said Berk. "Having a larger range of underlying math problems for our crypto allows us to harden a bit against the possibility of one finding an efficient algorithm to crack one or more of them."
Moody said the next set of algorithms will likely be announced in the next year and a half, and none will be lattice-based. While four additional tools will be evaluated, the agency expects to select only one or two others to enshrine into U.S. standards, including another option for general encryption. Although the public will need to wait to learn what those selections will be, NIST officials have already begun drafting standards for the four algorithms announced July 5, and still expect to complete the standards process by 2024.
That will put federal agencies on target to meet goals laid out by the Biden administration, which earlier this year issued a security memo establishing a host of timelines and mandates for the “timely and equitable” replacement of public-key algorithms in federal systems and devices.
Individual agencies will have a year to inventory all federal systems and assets that rely on public-key algorithms, the form of classical encryption most likely to be broken by a future quantum computer. Those inventories will eventually be submitted to the Cybersecurity and Infrastructure Security Agency and the Office of the National Cyber Director, which will scope out budget and funding needs surrounding the transition by October 2023.
It's not just federal agencies that will likely end up using these standards. Multiple companies and experts in post-quantum cryptography have told SC Media that NIST standards will likely end up being adopted by large swaths of the private sector as well as international standards bodies.