Quantum Particulars Guest Column: “Quantum Security’s Unsung Heroes: a NIST Post-Quantum Cryptography (PQC) Standardization Conference Review” – Inside Quantum Technology

Tom Patterson, Quantum Security Global Lead at Accenture, discusses the recent NIST PQC conference

By Kenna Hughes-Castleberry posted 25 Apr 2024

“Quantum Particulars” is an editorial guest column featuring exclusive insights and interviews with quantum researchers, developers, and experts looking at key challenges and processes in this field. This article, focusing on the NIST Post-Quantum Cryptography (PQC) standardization conference, is written by Tom Patterson, the Quantum Security Global Lead at Accenture.

A very diverse group of unassuming people from around the world met in a nondescript hotel ballroom in the Maryland suburbs of Washington, D.C. on April 10 through 12, and the world got safer. Yes, I’m referring to the fifth National Institute of Standards and Technology (NIST) Post-Quantum Cryptography (PQC) standardization conference; yes, it has made the world safer; and no, I’m not exaggerating.

Why We Gathered at NIST

Today’s hyperconnected world of governments, finance, health, transportation and defense stands on a pillar of trust: that digital information is protected and can be exchanged safely around the globe. However, even as cyberattacks become more sophisticated and the cybersecurity landscape becomes more difficult to navigate, a new and more foreboding threat has appeared: quantum computers that will soon have the potential to break today’s most widely used cryptography. Q-Day is coming, and the time to prepare is now.

Benefits and Risks

Quantum computers will certainly bring many benefits to society, but they also pose a serious threat to today’s digital cybersecurity. Running Shor’s algorithm, these computers will be able to quickly factor large numbers into their primes, which undermines the underlying math (and therefore the effectiveness) of today’s public key encryption.

What’s the Fact(or) Jack?

One of the underlying principles of public key encryption, created back in the 1970s, relies on the mathematical concept of factoring. It was realized that computers would have no trouble multiplying two numbers together and agreeing on the product, and it was thought then, and is still trusted today, that if you make the numbers big enough, we’ll never build a computer fast enough to ‘factor’ that big product and recover the two original numbers. For instance, a computer can easily multiply three and five to get 15, but going the other way, starting from 15 and factoring it to find the five and the three, becomes impractical once the numbers are hundreds of digits long. That was and is the basis for most of the encryption in motion that the world uses every moment of every day, and that is all about to change.
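The asymmetry described above can be sketched in a few lines of Python. The primes and the brute-force trial division here are purely illustrative; real public-key moduli are hundreds of digits long, far beyond any brute-force search:

```python
# Toy illustration (not real cryptography): multiplying two primes is one
# operation, but recovering them from the product by trial division takes
# roughly sqrt(n) divisions, which explodes as the numbers grow.

def trial_factor(n: int) -> tuple[int, int]:
    """Find the smallest factor of n by brute-force trial division."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n is prime

p, q = 2003, 3001       # small "secret" primes (illustrative only)
n = p * q               # the easy direction: a single multiplication
print(trial_factor(n))  # the hard direction: thousands of divisions → (2003, 3001)
```

Doubling the number of digits in `p` and `q` roughly squares the work for `trial_factor` while barely affecting the multiplication, which is the gap classical public-key cryptography lives in.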

Things are Happening Gradually 

As Ernest Hemingway wrote in 1926, things happen “gradually and then suddenly.” For the past 10 years or so, the United States’ NIST has been gradually leading a global effort to agree on new algorithms that do not rely on the presumed hardness of factoring and that can support the encryption in motion the world needs. Over the years, NIST has evaluated hundreds of different algorithmic approaches and shared the results with the world. The most promising candidates have been subjected to tests ranging from security to performance and more, which continuously whittled down the list.

A Moment of “Suddenly” 

We’re now in that ‘suddenly’ moment in time. After 10 years and hundreds of attempts, this NIST-led group has agreed on the new set of algorithms that will make up the PQC standards. This is critically important for humanity. This standardization is the starting bell for most organizations around the world that have been waiting for ‘the answer’ before they begin ‘their journey.’ 

The Selected Algorithms

The selected algorithms are:

  • CRYSTALS-KYBER: a key encapsulation mechanism (KEM) based on structured lattices, offering a good balance of all-around performance and security.
  • CRYSTALS-DILITHIUM: a good digital signature algorithm based on structured lattices. NIST now recommends it as the primary signature algorithm due to its combination of security, performance and relative ease of implementation.

NIST also advanced two other important signature algorithms:

  • FALCON: also based on structured lattices, it requires less bandwidth (smaller signatures) but has a much more complicated implementation. Accordingly, the FALCON standard will come out after the others.
  • SPHINCS+: a stateless hash-based algorithm that provides solid security but lags in performance.
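To make the KEM idea behind CRYSTALS-KYBER concrete, here is a toy Python sketch of the keygen / encapsulate / decapsulate flow that any KEM exposes. The math is classical Diffie-Hellman over a small prime group, chosen only for brevity; it is emphatically not Kyber’s lattice construction and is not post-quantum secure:

```python
# Toy KEM interface sketch (NOT post-quantum, NOT for real use).
# Shows the three-operation shape a KEM standardizes: the receiver publishes
# pk, the sender encapsulates a fresh symmetric key against pk, and the
# receiver decapsulates the same key from the ciphertext.
import hashlib
import secrets

P = 0xFFFFFFFFFFFFFFC5  # the prime 2**64 - 59 (illustratively small)
G = 5                   # group generator (illustrative choice)

def keygen():
    sk = secrets.randbelow(P - 2) + 1
    pk = pow(G, sk, P)
    return pk, sk

def encapsulate(pk):
    r = secrets.randbelow(P - 2) + 1
    ciphertext = pow(G, r, P)      # sent to the key holder
    shared = pow(pk, r, P)         # G**(sk*r) mod P
    key = hashlib.sha256(shared.to_bytes(8, "big")).digest()
    return ciphertext, key

def decapsulate(ciphertext, sk):
    shared = pow(ciphertext, sk, P)  # the same G**(sk*r) mod P
    return hashlib.sha256(shared.to_bytes(8, "big")).digest()

pk, sk = keygen()
ct, key_sender = encapsulate(pk)
key_receiver = decapsulate(ct, sk)
assert key_sender == key_receiver  # both sides now share a symmetric key
```

The practical point: migrating to the new standards mostly means swapping what happens inside these three functions, which is why crypto-agile interfaces matter so much.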

My Takeaways on the NIST Conference 

  • NIST is doing a great job—both with their in-house leaders and the way they have attracted, involved, empowered and leveraged a highly diverse group of experts in all the myriad aspects of post-quantum cryptography. It’s a truly impressive (and often unheralded) group effort.  
  • This fifth standardization conference was pivotal in communicating the go signs for industry, empowering the experts and engendering the necessary confidence in the results to date. 
  • NIST clearly announced to the world that the organization plans to deliver the PQC standard algorithms this summer. That rings the starter’s bell for CISOs around the world to prepare for post-quantum cryptography now.
  • The time is now to get started. The world’s most important organizations should be developing their own quantum security strategy, including a quantum risk analysis and multi-year roadmap. They should also begin their own discovery process, to identify their vulnerable encryption and what it’s being used to protect today. Then they need to look at the new concept of crypto agility.
  • Crypto agility will be key for enterprises around the world. One important message from this NIST gathering was that even with new standards, algorithms are going to be in a state of flux for a while. NIST announced that the ‘first’ of the new PQC algorithms will be released this summer and that more will follow: new algorithms for different purposes, some faster, some smaller and some stronger.
  • Library agility will also be a reality for most multinational organizations, as different countries will want their own instances of these new standards to be used. This will require a lot of agility in the enterprise. Doing this by hand could make the job insurmountable, while leveraging an orchestrated crypto agility engine makes this process routine—and compliant!  
  • Side channel attacks are the real deal, and with any new standard, the implementation is a big point of vulnerability. Demonstrations from top researchers around the world were very impressive, showing key recovery from precise power monitoring and other side channel methods (one showed over a 50% chance of recovering your keys by monitoring power for six minutes). Take these seriously, and ensure you put as much thought into how you implement these new algorithms as went into creating them.
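One way to picture the crypto agility mentioned in the takeaways above: route every cryptographic call through a named-algorithm registry, so swapping a primitive becomes a configuration change rather than a hunt through every call site. This is a minimal hypothetical sketch, using standard-library hashes as stand-ins for the primitives an enterprise would actually manage:

```python
# Minimal crypto-agility sketch (illustrative names and structure).
# Call sites ask for an algorithm by name; the registry decides what runs.
# Retiring an algorithm or adding a PQC replacement touches one table.
import hashlib

HASH_REGISTRY = {
    "sha256": hashlib.sha256,
    "sha3_256": hashlib.sha3_256,  # swapping primitives = editing this table
}

def digest(data: bytes, algorithm: str = "sha256") -> bytes:
    try:
        return HASH_REGISTRY[algorithm](data).digest()
    except KeyError:
        raise ValueError(f"unapproved algorithm: {algorithm}")

# A call site never hard-codes the primitive:
print(digest(b"hello", "sha3_256").hex())
```

An orchestrated crypto agility engine, as described above, is essentially this pattern applied fleet-wide, with inventory, policy, and compliance reporting layered on top.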

Overall, these quantum experts are unsung heroes. A special thanks and recognition to NIST, especially Dustin Moody, Bill Newhouse, Angela Robinson and all of the great NIST leaders who have organized the best and brightest in the world to come together on these critical new standards.

Tom Patterson is the global Emerging Technology Security lead at Accenture. In his role, Tom is responsible for developing and delivering solutions that help governments, companies and organizations leverage critical new technologies including quantum, space, 5G/6G, AI and more in the most secure fashion for resilient business and critical infrastructure around the world.

