It’s been a little over three months since NIST announced the first Post Quantum Cryptography (PQC) algorithms selected for standardization after Round 3, along with additional candidates undergoing further study in Round 4. Since then there have been other developments, including an announcement that one of the Round 4 candidates, SIKE, has been broken, and a call by NIST for proposals for additional PQC digital signature algorithms in order to provide more diversity. In addition, NIST kicked off a Migration to Post-Quantum Cryptography Project and recruited a consortium of twelve corporate partners to raise awareness of the challenges involved in migrating from the current set of public-key cryptographic algorithms to quantum-resistant algorithms.

To get a better feel for how implementation is going and an idea of some of the key issues, we interviewed a few individuals involved in the PQC efforts to get their views on selected topics. The individuals we talked to include Thomas Pöppelmann from Infineon, Vincent Berk from Quantum Xchange, Rebecca Krauthamer from QuSecure, Vadim Lyubashevsky from IBM, Wil Oxford from Anametric, Michele Mosca and Andrew Hammond from evolutionQ, and Jack Hidary from SandboxAQ. We learned several interesting things.

The first question we asked was when they recommend a CIO get serious about starting a quantum safe program within their company. The universal answer we received from the experts is that an organization should start one immediately if it hasn’t already done so. They also saw challenges in convincing some users of the need for immediacy. CIOs already have multiple non-quantum-related cybersecurity problems and are certainly not eager to add another one. Also, the quantum threat is not as readily visible as something like a ransomware attack. There is no hard deadline for when becoming quantum safe is absolutely needed, which was a huge motivating factor with Y2K. And within a corporation, cybersecurity is viewed as a cost center rather than a revenue generator, so getting a budget for this from corporate management is a little more challenging. In a few cases, there is some naivety among end users about how hard it is to implement post-quantum security. Some may believe it is just another firmware update that they will be able to implement quickly. Although that may be true for consumer devices like PCs and cell phones, for enterprise computing and IoT devices it will be much more complicated.

An additional challenge with CIOs who do follow PQC and the NIST competition is the recent breaking of the Rainbow and SIKE algorithms. They are also aware that although NIST has selected some algorithms, the formal approval and publishing of the standards with recommended parameters is still about two years off. So some of them are worried about implementing a technology that is not fully baked! The CIOs don’t want to spend a lot of money implementing a PQC algorithm in their organization only to find out later that it has been broken.

The PQC vendors, of course, have multiple arguments against these concerns. First, they point out that converting to a quantum safe infrastructure will take a long time. It is not as simple as pushing out a firmware update. To start, identifying all the areas where a PQC upgrade is needed can be a challenge. Many enterprise CIOs have IT infrastructures that have been built up over multiple decades, and many of the original implementers may no longer be working at the company. Just taking an inventory of where cryptography is used in the infrastructure may be a challenge. One of the tools PQC vendors can deploy to help with this is the various analyzers that can inspect a network and identify where cryptography is used. Jack Hidary of SandboxAQ mentioned that this was a benefit they derived from their acquisition of Cryptosense. Although both companies had analyzer software of their own, the two pieces were complementary, and SandboxAQ has already integrated them to create something even more effective.
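As a rough illustration of what the simplest form of such an inventory pass looks like, the sketch below uses only Python’s standard ssl and socket modules to connect to a list of endpoints and report which protocol version and cipher suite each one negotiates, so that quantum-vulnerable key exchanges can be flagged for migration. The hostnames are placeholders, and the commercial analyzers mentioned above are of course far more thorough than this.

```python
# Minimal crypto-inventory probe sketch: connect to each endpoint over TLS
# and report the negotiated protocol version and cipher suite.
# The endpoint list below is a hypothetical example.
import socket
import ssl

ENDPOINTS = ["example.internal:443", "legacy-app.internal:8443"]

def probe(hostport: str) -> None:
    host, port = hostport.split(":")
    ctx = ssl.create_default_context()
    ctx.check_hostname = False          # inventory scan, not trust validation
    ctx.verify_mode = ssl.CERT_NONE
    with socket.create_connection((host, int(port)), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            name, version, bits = tls.cipher()
            print(f"{hostport}: {version} {name} ({bits}-bit)")

for endpoint in ENDPOINTS:
    try:
        probe(endpoint)
    except OSError as exc:
        print(f"{endpoint}: unreachable ({exc})")
```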

Another recommendation we heard from many of the vendors is to use hybrid algorithms that combine classical RSA or elliptic curve encryption with one of the PQC algorithms. There are several reasons for recommending this approach. First, it provides some insurance in case a weakness is found in the PQC algorithm. A hacker would have to break not only the PQC algorithm but also the classical RSA or elliptic curve algorithm, and the classical algorithms have already survived several decades of scrutiny without anyone finding a practical classical attack. In addition, end users are very reluctant to have vendors work on routines deep within their systems, particularly software modules involved with security. It is much easier for an end user to accept a vendor adding additional modules outside their core infrastructure than to do a rip and replace of a module deep inside their systems. In fact, some companies in regulated industries may be required to continue using classical cryptography due to certain standards they need to meet, and those standards are expected to be very slow to change.
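To show what the hybrid idea looks like in practice, here is a minimal sketch that runs an X25519 exchange alongside a Kyber key encapsulation and feeds both shared secrets into a single key derivation step, so an attacker would need to break both schemes to recover the session key. It assumes the open-source liboqs Python bindings (oqs) and the cryptography package are available; the Kyber768 parameter set and API details are our illustrative assumptions, not any particular vendor’s implementation.

```python
# Hybrid key establishment sketch: X25519 (classical) + a PQC KEM (Kyber via
# the liboqs "oqs" bindings), with both secrets combined through one HKDF.
import oqs
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Classical part: a standard X25519 Diffie-Hellman exchange.
client_ec = X25519PrivateKey.generate()
server_ec = X25519PrivateKey.generate()
ec_secret = client_ec.exchange(server_ec.public_key())

# Post-quantum part: the server encapsulates against the client's KEM key.
with oqs.KeyEncapsulation("Kyber768") as client_kem:
    kem_public = client_kem.generate_keypair()
    with oqs.KeyEncapsulation("Kyber768") as server_kem:
        ciphertext, server_secret = server_kem.encap_secret(kem_public)
    client_secret = client_kem.decap_secret(ciphertext)

assert client_secret == server_secret

# Both secrets feed one KDF; breaking either scheme alone is not enough.
session_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"hybrid-demo",
).derive(ec_secret + client_secret)
print("derived", len(session_key), "byte hybrid session key")
```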

One concept we heard frequently is that of crypto agility. The overall architecture of the PQC modules should be implemented so that it is easy to change parameters or swap out complete algorithms. As mentioned above, there are still concerns that a weakness might be found in a selected algorithm in the future, or that a recommendation might be made to change the specific parameters used with a particular algorithm. The NIST competition was designed with this possibility in mind, because it has emphasized selecting multiple algorithms that can be used to achieve algorithm diversity. Although it takes more effort to design an agile architecture that can adapt to different algorithms than to design for one very specific algorithm, it may well be worth the effort. One area where agility may be more of a challenge is designing IoT chips with integrated cryptography modules. These devices are typically more constrained in memory, processor performance, power, and cost than larger computers installed in a data center, so the chip companies will need to work hard to provide agility in a single chip.
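The sketch below illustrates the crypto-agility idea in miniature: application code depends on a generic signer interface and selects the concrete algorithm from configuration, so moving to a PQC signature scheme later becomes a configuration change plus a new backend class rather than a rewrite of calling code. The interface, backend names, and config keys here are illustrative placeholders, not a real product API.

```python
# Crypto-agility sketch: callers depend on the Signer interface and the
# concrete algorithm is chosen by configuration, not hard-coded.
from abc import ABC, abstractmethod
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

class Signer(ABC):
    @abstractmethod
    def sign(self, message: bytes) -> bytes: ...
    @abstractmethod
    def verify(self, message: bytes, signature: bytes) -> bool: ...

class Ed25519Signer(Signer):
    def __init__(self) -> None:
        self._key = Ed25519PrivateKey.generate()
    def sign(self, message: bytes) -> bytes:
        return self._key.sign(message)
    def verify(self, message: bytes, signature: bytes) -> bool:
        try:
            self._key.public_key().verify(signature, message)
            return True
        except InvalidSignature:
            return False

# A future PQC backend (e.g. a Dilithium-based Signer) would slot in here.
BACKENDS = {"ed25519": Ed25519Signer}

def signer_from_config(config: dict) -> Signer:
    return BACKENDS[config["signature_scheme"]]()

signer = signer_from_config({"signature_scheme": "ed25519"})
sig = signer.sign(b"firmware image")
print(signer.verify(b"firmware image", sig))
```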

One trend we heard from several of the vendors was a recommendation to use quantum random number generators (QRNGs) as the entropy source for creating encryption keys. Today, most classical encryption systems use pseudo-random number generators, and the concern is that with the increased power of large computers and the use of advanced machine learning, there is a possible risk of someone discovering the key. In addition, the PQC algorithms generally require larger key sizes and many more key exchanges in order to complete a communication. So more entropy is required, and using non-deterministic QRNGs helps to reduce the risk. For more information, evolutionQ has published a good report titled Quantum Random-Number Generators: Practical Considerations and Use Cases.
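One simple way to picture this is to treat the entropy source as an injectable dependency, as in the sketch below: key derivation pulls randomness through a single hook that defaults to the OS CSPRNG today and could be pointed at a QRNG appliance later. The /dev/qrng0 device path is hypothetical, and real deployments would use a proper conditioning function rather than a single hash.

```python
# Injectable entropy source sketch: swap the randomness hook without touching
# the key-derivation code. The QRNG device path below is a hypothetical name.
import hashlib
import os
from typing import Callable

def os_entropy(n: int) -> bytes:
    # Today: the operating system's CSPRNG.
    return os.urandom(n)

def qrng_entropy(n: int, device: str = "/dev/qrng0") -> bytes:
    # Hypothetical: raw quantum entropy read from a hardware device file.
    with open(device, "rb") as f:
        return f.read(n)

def derive_key(get_random: Callable[[int], bytes]) -> bytes:
    # Simple conditioning of 64 bytes of raw entropy into a 32-byte key.
    return hashlib.sha256(get_random(64)).digest()

key = derive_key(os_entropy)        # current entropy source
# key = derive_key(qrng_entropy)    # later: quantum entropy source
print(len(key), "byte key")
```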

Another consideration for implementing PQC is the need for testing. A user should remember that converting from classical encryption to one of the PQC or hybrid approaches will likely result in changes in system latency, key sizes, and ciphertext sizes. Although in many systems this may not matter, there may be a few systems where it causes a problem. So this is another reason why it is wise to start now, to provide time for testing to be performed and fixes implemented before an enterprise goes live.
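As a starting point for that kind of testing, the short sketch below records the public key, ciphertext, and shared secret sizes plus encapsulation time for one candidate KEM. It again assumes the liboqs Python bindings; the algorithm name is our choice for illustration, and the numbers will vary by platform and parameter set.

```python
# Pre-deployment measurement sketch: size and timing figures for one KEM,
# useful for spotting latency- or size-sensitive systems before going live.
import time
import oqs

ALG = "Kyber768"  # illustrative parameter set

with oqs.KeyEncapsulation(ALG) as kem:
    public_key = kem.generate_keypair()
    start = time.perf_counter()
    ciphertext, secret = kem.encap_secret(public_key)
    elapsed_ms = (time.perf_counter() - start) * 1000

print(f"{ALG}: public key {len(public_key)} B, "
      f"ciphertext {len(ciphertext)} B, "
      f"shared secret {len(secret)} B, "
      f"encapsulation {elapsed_ms:.2f} ms")
```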

Nonetheless, the vendors are reporting an increased level of interest from end customers investigating PQC. Converting the worldwide digital communications infrastructure to quantum safe approaches will take 10-20 years and require massively more resources than any previous security upgrade. The vendors report that some of the earliest adopters are within the DoD, banks, and telecom companies. Although the PQC algorithms themselves are open source, there will be significant revenue opportunities for vendors in this market from ancillary software and consulting services. And the vendors are taking different approaches to serving their customers. Quantum Xchange has developed a product called Phio Trusted Xchange (TX) that provides a unique key delivery system and crypto-diverse management platform and eliminates single points of failure. QuSecure has just signed a distribution deal with Arrow Electronics to have Arrow’s 8,000-person sales force represent QuSecure’s solution to its federal and commercial customer base. SandboxAQ has taken an unusual approach for a startup company and has established a Strategic Investment Program to invest in or acquire other companies working in this segment. And IBM has announced its new z16 mainframe computer with built-in functionality to provide quantum-safe cryptography.

So there will be many opportunities for vendors of quantum safe products in the years ahead, and we just hope they can stay one step ahead of the bad guys, who will do everything they can to break into systems using any security hole they can find.

October 22, 2022