Can Researchers Exponentially Improve Quantity and Quality at the Same Time?

We read with interest a recent article in Nature by Joseph Emerson of Quantum Benchmark and IQC indicating that entangling gate performance in quantum computers is doubling every 10 months. Previously, we had published an article indicating that quantum computer qubit counts are expected to double every one to two years, much as semiconductor transistor counts grew under Moore’s Law.
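
For a rough sense of what these two doubling times imply, here is a small illustrative calculation. The 10-month figure comes from the article above; the 18-month figure is simply the midpoint of the one-to-two-year range, and the five-year horizon is an arbitrary choice for the example.

```python
# Illustrative only: compound growth implied by the doubling times quoted above.
# Assumes gate performance doubles every 10 months and qubit count every
# 18 months (midpoint of the 1-2 year range); the 60-month horizon is an
# arbitrary assumption made for the sake of the example.

def growth_factor(months: float, doubling_period_months: float) -> float:
    """Multiplicative improvement after `months`, given a fixed doubling period."""
    return 2 ** (months / doubling_period_months)

horizon = 60  # five years, in months
print(f"Gate performance: ~{growth_factor(horizon, 10):.0f}x")  # ~64x
print(f"Qubit count:      ~{growth_factor(horizon, 18):.0f}x")  # ~10x
```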

But an important question should now be raised: can researchers exponentially improve both quality and quantity at the same time? The potential problem is that increasing the number of qubits on a chip can have a negative effect on qubit coherence times and gate fidelities. Placing more qubits on a chip will likely require larger die sizes, which can introduce slightly higher process variability across the die. More qubits also raise the possibility of increased cross-talk between them. In addition, maintaining the same consistent level of temperature control and magnetic shielding becomes more challenging as die sizes grow.

We don’t know the answer to this question, but we would point out that when semiconductor technology was following Moore’s Law and scaling up to the billions of transistors per chip we have today, it did not face the same level of quality challenges imposed by quantum physics. To be sure, there were miscellaneous contamination issues and occasional reliability concerns that needed to be addressed, but semiconductor technology did not require exponential rates of quality improvement with each generation. Even in the early days, small-scale integrated circuits were quite reliable.

Since the overall power of a quantum computer depends on a combination of both qubit count and qubit quality, we are certain that this combination will increase at a rapid rate. But it is still too early for us to tell how much of that improvement will be attributable to one of these factors versus the other.
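
One simple way to see why count and quality must improve together is a toy model in which the number of useful operations is roughly limited by the two-qubit gate error rate. This is our own simplified heuristic, not a metric from the article, and the device numbers are made up purely for illustration.

```python
# Toy heuristic (our assumption, not a standard benchmark): a machine can run
# square circuits of roughly min(qubits, 1 / two_qubit_error) before errors
# dominate, so "effective circuit size" captures both count and quality.

def effective_circuit_size(num_qubits: int, two_qubit_error: float) -> int:
    """Rough width/depth of the largest square circuit the device can run."""
    return int(min(num_qubits, 1.0 / two_qubit_error))

# Hypothetical devices: more qubits only help if error rates keep pace.
print(effective_circuit_size(50, 1e-2))    # 50 qubits, 1% error        -> 50
print(effective_circuit_size(1000, 1e-2))  # more qubits, same error    -> 100
print(effective_circuit_size(1000, 1e-3))  # more qubits AND less error -> 1000
```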