Protecting hardware in the quantum era

Dr. Axel Poschmann looks at how encryption is evolving to address the risks associated with the development of quantum computing.

This year, 2024, is set to be the year that cybersecurity enters the quantum era. As powerful quantum computers capable of easily cracking our current encryption standards edge closer to reality, a radical shift in cryptography is underway to protect the vital components of our technology supply chain.

Experts have been warning of this threat for decades, ever since Peter Shor published what became known as Shor's algorithm in 1994.

Shor showed that a sufficiently powerful quantum computer could efficiently solve the integer factorisation and discrete logarithm problems - meaning the RSA and ECC public-key cryptography schemes, which are now foundational to modern security, could in theory be broken.

Now, with researchers making serious strides towards viable quantum computers, an effort is underway to mitigate this risk and change the way we encrypt our hardware and software. Without it, the enormous benefits quantum computing offers will be outweighed by the security cost of quantum attacks.

The National Institute of Standards and Technology (NIST) is leading the charge on this. In 2016, NIST launched its post-quantum cryptography (PQC) standardisation project, aiming to develop a new global benchmark to bolster defences against quantum attacks.

This project culminated in NIST's draft PQC standards being published in August 2023, and they will be ratified and finalised later this year. The drafts are the product of an eight-year process that brought together the entire cryptographic community - including government agencies, academics, startups and industry voices.

The standardisation process, though, was the precursor to an even more important task: the global migration to PQC algorithms. This year, we have already seen Apple unveil a PQC protocol to protect iMessage data from quantum attacks, following other security-conscious products such as Google Chrome, Signal and ExpressVPN in taking this step.

There is now real pressure for NIST’s standards to be implemented across the supply chain, and the rest of the sector needs to start following Apple’s example soon. In its recently published CNSA 2.0 guidance, the NSA has already announced that software/firmware signing, web browsers/servers and cloud services should pivot to replace vulnerable cryptography with quantum-secure alternatives by 2030, with quantum-safe algorithms mandated to be the default and preferred option as soon as 2025.

With NIST’s standards soon to be ratified and the migration to quantum-safe cryptography beckoning, quantum security is set to be a competitive advantage for hardware manufacturers going forward. How can they make sure they meet these standards and make the transition in time?

Assessing your current cryptography

The first step in quantum-readiness is to assess the current vulnerabilities in an organisation's hardware cryptography and to understand the roadmap to PQC. NIST, the NSA and CISA have already published joint guidance on this process.

Taking an inventory of the quantum vulnerabilities in your hardware is therefore key to modernising cryptography. Cryptographic discovery tools exist to help manufacturers kick-start this process, and enquiring with technology vendors about embedded cryptography can help ensure that all bases are covered.

Running discovery tools on end user systems and servers, network protocols, applications and software libraries can help uncover quantum-vulnerable RSA and ECC algorithms that will need to be replaced under the latest guidance. It’s especially important for the transition process to note where vulnerable cryptography is used to protect sensitive and critical data, and the protocols that are used to access it.
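
As an illustration, the sketch below shows the kind of check a discovery tool performs: walking a directory of PEM certificates and flagging keys that rely on Shor-vulnerable RSA or ECC. It is a minimal Python example that assumes the third-party 'cryptography' package; the scan path and report fields are illustrative, and real discovery tools also cover live TLS endpoints, code-signing keys and embedded firmware.

    """Toy cryptographic inventory: flag quantum-vulnerable RSA/ECC certificates."""
    from pathlib import Path

    from cryptography import x509
    from cryptography.hazmat.primitives.asymmetric import ec, rsa


    def inventory_certificates(cert_dir: str) -> list[dict]:
        """Walk a directory of PEM certificates and record each key algorithm."""
        findings = []
        for pem_path in Path(cert_dir).rglob("*.pem"):
            try:
                cert = x509.load_pem_x509_certificate(pem_path.read_bytes())
            except ValueError:
                continue  # not a parseable certificate; skip
            key = cert.public_key()
            if isinstance(key, rsa.RSAPublicKey):
                algorithm, quantum_safe = f"RSA-{key.key_size}", False      # Shor-vulnerable
            elif isinstance(key, ec.EllipticCurvePublicKey):
                algorithm, quantum_safe = f"ECC ({key.curve.name})", False  # Shor-vulnerable
            else:
                algorithm, quantum_safe = type(key).__name__, None          # triage manually
            findings.append({
                "file": str(pem_path),
                "subject": cert.subject.rfc4514_string(),
                "algorithm": algorithm,
                "quantum_safe": quantum_safe,
            })
        return findings


    if __name__ == "__main__":
        # Illustrative path only; point this at your own certificate stores.
        for entry in inventory_certificates("/etc/ssl/certs"):
            if entry["quantum_safe"] is False:
                print(f"[VULNERABLE] {entry['file']}: {entry['algorithm']}")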

By not taking this first step and understanding their exposure, hardware manufacturers could fall foul of the requirements set out in CNSA 2.0 and its 2025/2030 timeline, and risk being blacklisted from future procurement processes.

Once a company has understood its vulnerability and taken this inventory, it is ready to start implementing PQC.

Keeping crypto agility in mind

Modernising cryptography is an important process, and it's vital that future updates and threats are kept in mind throughout. Through a comprehensive audit of their IT infrastructure, many organisations will find they have a mix of legacy cryptography with varying levels of vulnerability, and the transition to PQC is an opportunity to upgrade their wider security posture at the same time. PQC is a new paradigm in cryptography with different compute requirements, which presents opportunities to rethink how organisations approach data security.

Those companies that have completed their cryptography audit and prepared a transition roadmap are already ahead of this threat and should begin testing implementations of PQC. Organisations can decide between two approaches: a direct transition to post-quantum cryptography, favoured by the US and UK, or a hybrid approach, sometimes referred to as post-quantum/traditional (PQ/T), which is recommended by regulators in France and Germany.
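
To make the hybrid idea concrete, the sketch below combines a classical X25519 key exchange with a post-quantum KEM secret through a key-derivation function, so an attacker would have to break both schemes to recover the session key. The X25519 and HKDF calls use the third-party 'cryptography' package; the post-quantum half is stubbed with random bytes for illustration, where a real deployment would run ML-KEM (Kyber) encapsulation via a PQC library.

    """Hybrid PQ/T key establishment sketch (single-process demo)."""
    import os

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF


    def pq_kem_shared_secret() -> bytes:
        # Placeholder: a real implementation would run ML-KEM (Kyber)
        # encapsulation here via a PQC library.
        return os.urandom(32)


    # Classical half: an X25519 Diffie-Hellman exchange between two parties.
    alice_private = X25519PrivateKey.generate()
    bob_private = X25519PrivateKey.generate()
    classical_secret = alice_private.exchange(bob_private.public_key())

    # Post-quantum half (stubbed above).
    pq_secret = pq_kem_shared_secret()

    # Combine both halves with HKDF: the derived key stays safe unless an
    # attacker breaks BOTH the classical and post-quantum schemes.
    session_key = HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=b"hybrid-pq/t-demo",
    ).derive(classical_secret + pq_secret)

    print(f"256-bit hybrid session key: {session_key.hex()}")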

With its preference for a direct transition, NIST has placed these implementations into two broad categories of PQC based on different families of mathematics: hash functions and lattice-based schemes. Meanwhile, in Europe, France's ANSSI and Germany's BSI also recommend Classic McEliece, a code-based algorithm, for key exchange.

Hash functions are the better-known solution, as they are already in everyday use through the SHA series, but in NIST's PQC standards they are limited to digital signatures. Lattice-based cryptographic schemes, on the other hand, are a more novel construction. Testing these implementations therefore requires a greater level of investment, including in their resistance to side-channel attacks.
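
To see why hash-based signatures feel familiar, the sketch below implements Lamport's one-time signature scheme - the simplest ancestor of the stateless hash-based standard SLH-DSA (SPHINCS+) - using nothing beyond SHA-256. It is a teaching example rather than the NIST standard itself: each key pair may sign only a single message, and the large keys and signatures hint at why hash-based schemes are reserved for niches such as code signing.

    """Lamport one-time signature: the core idea behind hash-based schemes."""
    import hashlib
    import os


    def keygen():
        # Secret key: 256 pairs of random 32-byte values, one pair per digest bit.
        sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
        # Public key: the SHA-256 hash of every secret value.
        pk = [(hashlib.sha256(a).digest(), hashlib.sha256(b).digest()) for a, b in sk]
        return sk, pk


    def message_bits(message: bytes) -> list[int]:
        digest = hashlib.sha256(message).digest()
        return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]


    def sign(sk, message: bytes) -> list[bytes]:
        # Reveal one secret value per bit of the message digest.
        return [sk[i][bit] for i, bit in enumerate(message_bits(message))]


    def verify(pk, message: bytes, signature) -> bool:
        # Hash each revealed value and compare against the published public key.
        return all(
            hashlib.sha256(value).digest() == pk[i][bit]
            for i, (bit, value) in enumerate(zip(message_bits(message), signature))
        )


    sk, pk = keygen()
    signature = sign(sk, b"firmware-image-v1.0")
    assert verify(pk, b"firmware-image-v1.0", signature)
    assert not verify(pk, b"tampered-firmware-image", signature)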

Many will therefore be tempted to focus their initial efforts on hash-based signatures, given their parallels with existing solutions and the opportunity to implement them in a hybrid fashion for code-signing use cases. However, guidance has made it clear that lattice-based signatures are here to stay and will become the standard solution, so you should start your transition to a quantum-secure future by tackling the lattice-based schemes.
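
Open-source implementations already exist for this kind of testing. The sketch below signs and verifies a message with a lattice-based scheme through the liboqs-python bindings; it assumes the 'oqs' package is installed over a liboqs build with the Dilithium (now standardised as ML-DSA) mechanism enabled, and the exact mechanism name can vary between liboqs versions.

    """Lattice-based signing sketch via liboqs-python (assumed installed)."""
    import oqs  # https://github.com/open-quantum-safe/liboqs-python

    MESSAGE = b"firmware-image-v1.0"
    MECHANISM = "Dilithium2"  # may appear as "ML-DSA-44" in newer liboqs builds

    # Signer side: generate a key pair and sign the message.
    with oqs.Signature(MECHANISM) as signer:
        public_key = signer.generate_keypair()
        signature = signer.sign(MESSAGE)

    # Verifier side: needs only the message, signature and public key.
    with oqs.Signature(MECHANISM) as verifier:
        assert verifier.verify(MESSAGE, signature, public_key)
    print("Lattice-based signature verified")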

This is why hardware manufacturers must consider the agility of their cryptography, not just its modernity. Companies building products that are likely to be in the field beyond the 2030 deadline set out in CNSA 2.0 need to take into account the longevity of their solutions. Even France, which accepts a hybrid transition period after 2025, would expect a full transition to quantum-safe cryptography after 2030.

As a rule of thumb, if you expect your products to be used by customers in five to 10 years’ time then you should be exploring implementation of lattice-based functions.

This might lengthen the product design timeline in the short term, as manufacturers will need to communicate the new power and space requirements for lattice-based cryptography to their vendors to make sure that future products are secure. By designing with the future in mind, they will be able to stay agile and open up further opportunities for innovation in 2030 and beyond.

Accelerating the transition

We are already seeing a huge demand for hardware products to support the shift to the cloud and adoption of AI. Alongside this, it is crucial that we embed agile quantum security into our physical IT infrastructure in order to safeguard these technologies from future threats.

This transition to post-quantum cryptography will be an immense technological evolution, like the leap from playing cassettes to streaming music.

Now, hardware manufacturers that implement PQC solutions have a real chance to get ahead of the market, be the first-movers into the post-quantum world, and provide truly secure products to customers that will desperately need them. 

Author details: Dr. Axel Poschmann is Vice President of Product at PQShield