At its annual Quantum Summit in New York, IBM unveiled its quantum roadmap for both hardware and software through the next 10 years. It represents a continuation of the roadmaps the company has presented previously, but it also promises a disruptive leap in 2024.

Entering a New Era of Quantum Utility

IBM views the quantum journey as divided into three eras. The first, the era of emergence, they date from their first cloud machine in 2016. The second, the era of utility, which IBM believes is starting now, is when quantum computers are able to solve problems beyond brute-force classical simulation. The third, quantum at scale, will use error correction to fully realize the promise of quantum computing.

IBM has published a paper in the journal Nature describing Quantum Utility, a blog post that summarizes the findings in the Nature paper, and another blog article with IBM's definition of Quantum Utility.

As defined by IBM, Quantum Utility reflects many of the attributes of what Google calls “Beyond Classical” computation (formerly Quantum Supremacy), though it puts less emphasis on a complexity-theoretic justification of quantum performance.

GQI notes that in the original conception of the term, all Noisy Intermediate Scale Quantum (NISQ) devices were supposed to be capable of ‘beyond classical’ calculations. However, the term ‘NISQ’ has become debased by its application to a long list of present-day devices that come nowhere near this. GQI has termed what IBM describes as the Era of Utility the True NISQ era. We see a growing body of evidence that such systems will drive very interesting results in frontier quantum science applications. The extent to which such systems may also find commercially useful applications remains an area of hot debate and disagreement in the field.

Milestones Achieved in 2023

It is always risky for a company to provide a roadmap for the future because sometimes technical issues get in the way! And, to be sure, analysts like us will save those original roadmaps and point out when a company doesn’t meet its original plan. Below is the roadmap that IBM presented a year ago at the Quantum Summit 2022. At the time, they indicated that by 2023 they would have a new Heron device with improved qubit quality, a Condor internal-only test device to try out qubit scaling and fridge capacity, Quantum Serverless for easier user management of jobs, and several other things. As the green checkmarks in the figure below show, they will have achieved all those milestones by the end of the year.

New Hardware Architecture Based on Tunable Couplers

With the Heron processor, IBM has successfully brought together the two major architectural streams it has been developing for some years: the scale-oriented fabrication technology of its ‘big bird’ chips (notably Eagle and Osprey), and the quality-oriented tunable coupler technology it has been developing in its recent ‘little bird’ architectures (notably Falcon R8 and Egret).

Tunable couplers are implemented as a Josephson junction placed between two adjacent qubits. The feature was first implemented by Google in its Sycamore device and has since been adopted by many of the superconducting processor developers, including Rigetti, Toshiba, systems in China, and others. A tunable coupler helps isolate neighboring qubits from unwanted interactions and greatly reduces issues like crosstalk. For IBM, it also provides a route to faster gate speeds while retaining its fixed-frequency transmon qubit approach.

The result is a significant improvement in qubit quality at scale. Shown below is a chart comparing their best 127-qubit Eagle class processor, codenamed Sherbrooke, with revision 1 of the new 133-qubit Heron class processor, codenamed Montecarlo. It shows an improvement of over 60% in the EPLG measure (see section below), with further improvements to come in Heron revision 2.

An observation GQI makes of many quantum computing roadmaps is that vendors project future fidelity improvements without any clear evidence supporting how these will be realized. Importantly, IBM has a strong explanation of the lever it has developed to deliver the projected Heron R2 fidelity improvements: two-level system (TLS) tunability. TLS defects are a key source of noise in superconducting circuit devices. IBM has been able to maintain its focus on fixed-frequency transmon qubits (relatively long-lived for superconducting circuit qubits) while instead ‘tuning away’ the TLS defects to avoid collisions that interfere with qubit operations. In particular, the qubits worst affected by TLS defects can be identified and fixed, and median qubit lifetime extended. Taken together with other advanced techniques for error suppression, GQI anticipates Heron R2 achieving an average 2Q fidelity of 99.9%.

Updated Performance Measures

IBM first introduced the Quantum Volume benchmark in 2017 to measure processor performance and the CLOPS metric to measure speed in 2021. But as processor sizes have grown beyond 100 qubits, IBM feels these measures no longer provide a complete picture. One issue with Quantum Volume is that it allows someone to pick the best set of qubits in the machine to run the test, which does not give a good representation of the performance of the average qubit. For example, one could take a 100-qubit processor and cherry-pick the 20 best qubits to achieve a Quantum Volume of 1,048,576, but that score would hide the fact that the other 80 qubits on the device might be of really poor quality. Another problem with Quantum Volume is that it requires comparing the results from the quantum device against an exact classical simulation, and as qubit counts grow it will no longer be possible to run that simulation. For these reasons, IBM has developed a new metric called Error Per Layered Gate (EPLG, derived from a Layer Fidelity measurement) and has updated the CLOPS measurement to a new measure called CLOPSh, which measures performance in a more hardware-aware way. A blog post describing these new measures is available here.
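The arithmetic behind the new metric can be sketched briefly. Per IBM's published description as we read it, a layer fidelity (LF) is measured over a chain of qubits and converted into an average error per layered two-qubit gate. The function below is our illustration of that conversion, not IBM's reference implementation, and the numbers used are made up.

```python
# Hedged sketch: converting a measured layer fidelity (LF) into an
# EPLG value via EPLG = 1 - LF**(1/n_gates), where n_gates is the
# number of two-qubit gates in the benchmarked layer (n_qubits - 1
# for a linear chain). This follows our reading of IBM's definition.

def eplg(layer_fidelity: float, n_qubits: int) -> float:
    """Average error per layered two-qubit gate over a linear chain."""
    n_gates = n_qubits - 1   # two-qubit gates in one layer of the chain
    return 1.0 - layer_fidelity ** (1.0 / n_gates)

# Illustrative (made-up) numbers: a layer fidelity of 0.2 across a
# 100-qubit chain corresponds to roughly 1.6% error per layered gate.
print(round(eplg(0.2, 100), 4))
```

Because the whole chain is benchmarked at once, every qubit in the chain contributes to the score, which is precisely the cherry-picking problem with Quantum Volume that this metric addresses.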

GQI understands the logic that has driven IBM to move beyond a focus on Quantum Volume. Layer Fidelity should be a useful measure for those tracking technical improvements across device generations, and for those targeting Utility Era applications. However, its wider adoption across the community seems less certain. Those now committed to a focus on early FTQC roadmaps will probably seek measures more intimately tailored to the error correction approaches they are targeting. At a more abstract level, GQI feels there may be merit in future measures that actively put a premium on the rate at which entanglement can be spread across a device. This seems to be emerging as a key indicator of how hard a calculation is to approximate classically.

Evolving Software Focus

A key part of IBM’s strategy has been to use access to its systems to build its network of collaborators, early adopters and partners in the market.

From its launch in 2017, Qiskit has been at the heart of IBM’s growing influence in the quantum software market. IBM’s plans continue to emphasize the role of their software platform. They are continuing to update Qiskit and announced that release 1.0, which they label as the first stable release of the software, will be available in February 2024. The growing functionality of Qiskit Runtime is being augmented by the Quantum Serverless offering to provide a PaaS environment able to flexibly combine the quantum and classical resources that will be required for practical applications. They also announced a generative AI code assistant called Qiskit Code Assistant, which will be a powerful tool to help customers learn the best ways to use both Qiskit and IBM Quantum Platform services. IBM is now planning to extend these tools with Qiskit Patterns (templates for common application types) and Quantum Functions, which are envisaged to build Qiskit Patterns up into true SaaS offerings.

IBM’s focus is moving up the stack. If this can be followed up by the delivery of real capabilities integrating a competitive suite of conventional compute capabilities and augmented by the ongoing revolution in AI, GQI thinks this could become the new big cloud story in its own right.

The IBM Quantum Network

IBM is also continuing to install quantum processors in remote locations outside its New York state facilities. Currently there are four Quantum Computation Centers operational: at Fraunhofer in Germany, the University of Tokyo in Japan, the Cleveland Clinic in Ohio, and Pinq2 in Canada. In 2024 they are scheduled to install more machines at Rensselaer Polytechnic Institute in upstate New York and Yonsei University in South Korea. And in 2025, they are projected to install machines at Ikerbasque in Spain and another international location that has not yet been announced. In addition to these Quantum Computation Centers focused on specific customers, in 2024 IBM will also open another quantum data center at its facility in Ehningen, Germany. This will operate in a manner similar to the U.S. data center they currently have in Poughkeepsie, New York.

IBM has long been active in cultivating its own ecosystem of quantum partners, particularly in the quantum midstack. In a sign of the growing maturity of this strategy, IBM has announced the integration of Q-CTRL Embedded error suppression software into IBM’s Quantum Pay-As-You-Go Plan. Based on Q-CTRL’s respected Fire Opal technology, the option lets an end user invoke Q-CTRL’s performance management with a single line of Qiskit code, yielding substantial error reduction on IBM’s 127-qubit processors.

GQI notes that the presence of midstack vendors such as Q-CTRL and QEDMA speaking at the IBM Quantum Summit is a strong indication of the importance IBM places on the role of the midstack in unlocking opportunity for Utility Era systems. The presence of vendors such as Algorithmiq points to the role quantum algorithm and application developers will also play.

The UK NQCC has recently partnered with IBM to give the strong and extensive quantum software research community in the UK access to premium-tier IBM systems. GQI views this as a significant endorsement of IBM’s view that its coming Utility Era systems will be influential. The NQCC, as a key pillar of the UK NQTP, is very well informed, and other things being equal might have been expected to favor indigenous UK hardware.

IBM cites growing usage statistics: 569,000 users of Qiskit, 2,595 technical papers published relating to IBM quantum systems, and 62 industry members of the IBM Quantum Network. A survey conducted in 2022 by the Unitary Fund found that IBM's Qiskit platform was the most popular development platform, used by 81% of the respondents surveyed.

GQI views the momentum that IBM has built in usage of its software frameworks as very significant, a pillar of its strategy every bit as important as its quantum hardware. IBM wants to see Qiskit go on to be as influential long term in quantum computing as NVIDIA’s CUDA has been in conventional high performance computing.

A key challenge for IBM will be the continued management of the expectations of its government and end-user collaborators, particularly if the road to real applications relevant to their interests does turn out to be long.

Disruptive Progress – Error Mitigation and Error Correction

IBM has previously described many of their efforts in error suppression, avoiding qubit errors at source using advanced control techniques. Implementing the Q-CTRL Embedded option is the most recent extension of this philosophy.

IBM has also increasingly emphasized the potential of error mitigation techniques: calibration-efficient measurement, probabilistic error cancellation, zero-noise extrapolation, circuit knitting, and others. These post-processing techniques leverage classical resources to improve the effective quality and performance of the underlying device. In 2023 IBM completed an impressive demonstration of this approach, using these techniques to perform a 60-layer-deep circuit on its 127-qubit Eagle processor (a total of some 2,880 CNOT gates; see the Nature paper Evidence for the utility of quantum computing before fault tolerance).

GQI views this result as a significant demonstration of the efficacy of these error mitigation techniques and of the fact that they can be applied relatively systematically at useful scales. Crudely, GQI sees them as adding ‘an extra nine’ of fidelity to the system performance, certainly better than we had anticipated.
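To make the post-processing flavor of these techniques concrete, here is a minimal, self-contained illustration of zero-noise extrapolation, one of the methods named above. Real systems amplify the noise (for example by gate folding), measure an observable at several noise scale factors, and extrapolate the results back to the zero-noise limit; the decay model and numbers below are entirely synthetic, for illustration only.

```python
# Hedged sketch of the classical post-processing step of zero-noise
# extrapolation (ZNE): interpolate measured expectation values taken
# at amplified noise scales and evaluate the fit at scale zero.
import math

def zne_richardson(scales, values):
    """Lagrange-interpolate (scale, value) pairs and evaluate at scale 0."""
    estimate = 0.0
    for i, (c_i, v_i) in enumerate(zip(scales, values)):
        weight = 1.0
        for j, c_j in enumerate(scales):
            if j != i:
                weight *= c_j / (c_j - c_i)   # Lagrange basis at scale 0
        estimate += weight * v_i
    return estimate

# Synthetic data: ideal expectation value 1.0, exponentially damped
# as the noise scale factor grows.
scales = [1.0, 2.0, 3.0]
values = [math.exp(-0.1 * c) for c in scales]
mitigated = zne_richardson(scales, values)
print(round(values[0], 3), round(mitigated, 3))  # raw vs mitigated
```

The mitigated estimate lands much closer to the ideal value than the raw measurement at scale 1, at the cost of running the circuit several times at amplified noise, which is the classical-resources-for-effective-quality trade the article describes.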

IBM will continue to find improvements in the physical hardware, controls, and firmware to improve qubit quality before error correction. In 2024 they want to demonstrate a system that implements 5,000 error-free gates without error correction codes (a goal previously referred to as the 100 x 100 Challenge). By 2028 they want to improve that to a system that can perform 15,000 gates, still without error correction. IBM wants to enable users to achieve early commercially useful quantum advantage via this approach.
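Some back-of-envelope arithmetic (ours, not IBM's) shows what these gate counts imply: to run N gates with an order-one overall success probability, the effective error per gate must be roughly 1/N. The targets therefore imply effective fidelities well beyond 99.9%, which is why mitigation's "extra nine" matters.

```python
# Hedged sketch: per-gate fidelity f needed so that f**N stays at an
# order-one success probability (1/e used as the nominal target).
import math

def required_fidelity(n_gates: int, target_success: float = math.exp(-1)) -> float:
    """Per-gate fidelity f such that f**n_gates equals target_success."""
    return target_success ** (1.0 / n_gates)

# 2023 Eagle demo, the 2024 target, and the 2028 target respectively.
for n in (2_880, 5_000, 15_000):
    print(n, round(required_fidelity(n), 6))
```

The 5,000-gate target works out to an effective two-qubit fidelity around 99.98%, and 15,000 gates to roughly 99.993%, beyond the 99.9% raw fidelity anticipated for Heron R2 and hence reachable only with mitigation layered on top.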

GQI regards these as very significant targets. If realized, they are certainly worthy of the disruptive claim IBM is making. While Google and USTC have previously demonstrated at or near ‘beyond classical’ calculations, those systems have struggled to move into routine deployment at this scale, and Xanadu’s similar demonstrations are within a more limited model of quantum computation. IBM’s proposal promises to bring interesting experiments in frontier quantum science into more routine range, though the wider quantum sector remains deeply split on whether this scale is enough for practical, commercially useful applications.

In any event, GQI believes such progress is set to put severe pressure on hardware developers whose business model or investor pitch relies on securing commercial revenue in this era but who cannot operate at this scale. The honeymoon period for hardware roadmaps is narrowing.

Beyond these gate targets, IBM believes it will need to use error correction, and it has an ultimate goal of creating a system, codenamed Blue Jay, that supports 1 billion error-free logical gates by 2033. The latest roadmap includes the future machines IBM sees as steps along this path.

Up until now, IBM has been relatively quiet about their efforts to develop error correction technology for future use. GQI was surprised to hear their claim that they have the largest group of engineers working on error correction in the industry.

Earlier this year, IBM quietly posted a paper on arXiv titled High-threshold and low-overhead fault-tolerant quantum memory that describes its approach to error correction. As a first step, IBM is moving away from the heavy-hex topology it previously used, which provided qubit-to-qubit connectivity to only the nearest 2 or 3 neighboring qubits. The new topology will couple each qubit to 6 others and will include long-range connections within a module, called C-couplers, and couplers between modules, called L-couplers. This will enable use of a much more efficient qLDPC (quantum low-density parity check) code. In the paper, IBM described how it could implement a memory circuit containing 12 logical qubits using 288 physical qubits, assuming a physical error rate of 0.1%. IBM expects to demonstrate this in hardware in 2026 on a system codenamed Kookaburra.
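The paper's headline numbers imply a striking overhead reduction, which can be checked in a couple of lines. The surface-code comparison below uses the common rough estimate of about 2d² physical qubits per logical qubit for a distance-d surface code; choosing d = 12 for a like-for-like comparison is our assumption, not a figure from IBM.

```python
# Hedged overhead comparison using the paper's own qubit counts.
qldpc_per_logical = 288 / 12          # 288 physical qubits, 12 logical qubits
surface_per_logical = 2 * 12 ** 2     # rough distance-12 surface code estimate
print(qldpc_per_logical, surface_per_logical)
print(round(surface_per_logical / qldpc_per_logical))  # approximate saving factor
```

On these assumptions the qLDPC memory needs roughly an order of magnitude fewer physical qubits per logical qubit than a comparable surface code, which is the "significantly lower overheads" prospect discussed below.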

Including IBM, GQI now identifies three quantum hardware vendor roadmaps that target the use of qLDPC codes. These hold out the prospect of significantly lower overheads compared with the standard route to FTQC based on the 2D surface code. The difficulty is that qLDPC codes require more than 2D nearest-neighbor connectivity. Though only one element in an overall computing architecture, IBM’s proposal for qLDPC-based quantum memory is notable in providing the first view of a plausible model for how such a scheme could be realized with superconducting circuit qubits.

GQI views the development of these models, qLDPC codes and other novel approaches to easing the burden of error correction such as cat qubits and the LHZ architecture, as a major potential disruptor for the sector. None is yet backed by the depth of theoretical research supporting the 2D surface code. However, with every step toward their substantiation, they threaten to cut the runway for hardware vendors whose roadmaps cannot match their benefits.

Challenges Ahead – Modular Scaling

IBM will only be able to support a few thousand physical qubits on a single module, so they need to develop multi-module interconnected systems to achieve larger scales. Their goal is to demonstrate classical connections between three Heron modules next year and then implement quantum connections between modules in 2025, using microwave links for the connections.

GQI welcomes the clarity with which IBM has described the different types of interconnects it plans to develop at different stages of its roadmap: C-couplers to enhance long-range connectivity on chip; M-couplers to directly connect modules; and L-couplers to implement meter-scale cryogenic microwave connections. However, GQI still views this as a challenging area of IBM’s roadmap (and that of many other players). Typically a trade-off exists between the fidelity of a link and the rate at which entanglement can be shared across modules. We await a clear demonstration of which technologies can adequately trade off these factors to support practical computing applications. IBM’s proposed solutions, the C-, M- and L-couplers, all remain new and still little-developed technologies. Patents could become an important aspect of the battle here.

Modular scaling based on relatively modest module sizes will also quickly lead to machines with very large footprints. An additional consideration is securing the He-3 that will be needed for all the dilution fridges required. This is just one example of the many supply chain issues that GQI believes the sector must move to address.

The Roadmap to 2033+

One thing to understand is that it takes IBM some time to implement all these innovations, and an advanced machine will sit in the lab for a long period before it is made available to users. During this period IBM measures and optimizes the design to achieve the best performance. To help the quantum community better understand its plans, IBM is now showing both an Innovation roadmap, which indicates which innovations it is working on and when, and an Availability roadmap, which shows when the technology will be available to users. Often the difference can be a year.

The figure below shows these roadmaps through 2033. It is worth noting that the processors shown in the black boxes are NISQ-level machines that will not support error correction, while the Starling and Blue Jay series in the gray boxes will support error correction for fault-tolerant computing.

Also shown in the roadmaps are the software developments IBM expects to deliver to make the systems easier to use and higher performing. One software function expected to be released in 2024 is Qiskit Patterns, which will allow a user to map a problem to quantum form, translate it to a quantum circuit, execute the circuit, and process the results. Another significant function slated for release in 2025 is Qiskit Functions, which will provide assistance for quantum simulations, quantum kernels, and optimization.

Chart Showing IBM’s Quantum Processor Roadmap through 2033
Chart Showing Additional Details of IBM’s Systems Implementing Error Correction

Earlier this year, IBM announced that it was providing $100 million in funding to the University of Chicago and the University of Tokyo to help develop a 100,000 physical qubit quantum processor by 2033. With error correction, these 100,000 physical qubits will form 2,000 logical qubits able to perform 1 billion operations before an error occurs, an improvement of over 200,000 times on what we have today. The partnership will provide IBM with additional resources in the areas of quantum communication, quantum middleware, quantum algorithms & error correction, and components with the necessary supply chain to implement this processor, codenamed Blue Jay. The roadmap now shows the intermediate milestones and technologies IBM will be developing over the next 10 years to reach this goal.
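The headline figures above are worth a quick sanity check using the article's own numbers; the "today" baseline below is the roughly 2,880-gate mitigated Eagle demonstration discussed earlier, which is our choice of reference point.

```python
# Simple check of the quoted figures (our arithmetic on the source's numbers).
physical_qubits, logical_qubits = 100_000, 2_000
print(physical_qubits // logical_qubits)   # physical qubits per logical qubit

# 1 billion error-free operations versus the ~2,880-gate mitigated
# demonstration already achieved on Eagle in 2023.
target_ops, todays_gates = 1_000_000_000, 2_880
print(target_ops // todays_gates > 200_000)
```

The plan implies about 50 physical qubits per logical qubit, a notably lean ratio consistent with the qLDPC direction described above, and the billion-operation goal does indeed exceed today's demonstrated circuit sizes by more than 200,000-fold.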

For all parties with interests in the quantum sector, not least established major tech vendors, GQI believes deciding which global quantum ecosystems to align with and invest in will be a key driver of long term success.

IBM’s Quantum Summit is a yearly occurrence, and we will report next year on their latest updates and how well they were able to meet their goals in 2024.

For additional information about IBM’s announcement, you can view their press release here, a blog article discussing Quantum Utility and the IBM System Two here, a blog article about Qiskit version 1.0 here, and another blog article that describes a new developer tool called Qiskit Patterns here.

December 4, 2023