Applying Moore’s Law to Quantum Qubits



In 1965, Gordon Moore predicted that the number of transistors on a silicon chip would double every year. This was based upon the empirical growth he had observed in the early years of the semiconductor industry. In 1975, he modified his prediction to indicate a doubling every two years. That prediction held for the next forty years, until Brian Krzanich announced last year that Intel was slowing the cadence to a new process generation every 2.5 years.

So the question comes up whether Moore’s Law can also be applied to quantum qubits. And early evidence suggests that indeed it may. If we take this as an assumption, we can make rough forecasts for qubit capacities in the coming years and show when a quantum computer may be able to solve certain meaningful problems. The resulting graph is shown below:
[Figure: Qubit projections versus algorithm requirements, September 12, 2016]

The upward sloping lines represent the qubit density forecasts for the various technologies. The adiabatic line is the prediction for quantum annealing machines like the D-Wave computers. These have followed the Moore’s Law prediction pretty closely so far, with the D-Wave 1 at 128 qubits in 2011, the D-Wave 2 at 512 qubits in 2013, the D-Wave 2X at 1097 qubits in 2015, and a 2048 qubit machine in 2017. Since this is not a universal quantum computer and has no error correction, the qubits are easier to build and the densities can be much higher.
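To make that growth rate concrete, here is a minimal sketch (in Python) that estimates the doubling period implied by the D-Wave qubit counts listed above. The calculation is only an illustration of the trend, not something taken from D-Wave’s roadmap.

```python
# Estimate the doubling period implied by the D-Wave qubit counts cited above.
import math

points = {2011: 128, 2013: 512, 2015: 1097, 2017: 2048}

years = sorted(points)
first, last = years[0], years[-1]
doublings = math.log2(points[last] / points[first])  # total doublings over the span
period = (last - first) / doublings                  # years per doubling

print(f"{doublings:.1f} doublings over {last - first} years "
      f"=> roughly one doubling every {period:.1f} years")
# => 4.0 doublings over 6 years => roughly one doubling every 1.5 years
```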

The upward sloping lines labelled Physical or Logical represent various types of gate-level quantum computers. The Physical curve predicts the number of physical qubits that will be available. There is less historical data on these, but there are indications that they will progress rapidly too. As examples, IBM has a 5 qubit machine available in the cloud through the IBM Quantum Experience and Google has demonstrated a 9 qubit machine. Both of these companies, and others, have indicated that these densities will increase rapidly, so the Physical curve maintains the improvement rate of a doubling every year for the next 10 years and a doubling every two years thereafter.
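Here is a minimal sketch of that growth schedule for the Physical curve: a doubling every year for the first ten years and a doubling every two years thereafter. The starting point of roughly 10 physical qubits in 2016 is an illustrative assumption, not a vendor roadmap.

```python
# Project physical qubit counts: double every year for the first ten years,
# then double every two years thereafter.
def projected_physical_qubits(year, base_year=2016, base_qubits=10, fast_years=10):
    """Projected physical qubit count for a given year under the assumed schedule."""
    elapsed = year - base_year
    fast = min(elapsed, fast_years)        # years in the doubling-every-year regime
    slow = max(elapsed - fast_years, 0)    # years in the doubling-every-two-years regime
    return base_qubits * 2 ** (fast + slow / 2)

for y in (2016, 2020, 2026, 2030):
    print(y, int(projected_physical_qubits(y)))
```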

However, as important as the number of qubits is the quality of those qubits and the amount of error correction that will be required. The expectation is that gate-level quantum computers will need a substantial amount of error correction, so that each logical qubit will consist of a number of physical qubits. The estimates for the ratio of physical to logical qubits can vary widely depending upon the technology used and the fidelity of the individual qubits. I have seen estimates ranging from about 10:1 for topological qubits to 1000:1 or more for other types. So the three curves plotting the number of logical qubits represent a range of quantum computers with error correction.
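The effect of these ratios is easy to quantify: the number of logical qubits a machine can offer is simply its physical qubit count divided by the physical-to-logical ratio. A minimal sketch, using an illustrative figure of one million physical qubits:

```python
# Logical qubits available for a given physical qubit count and error correction ratio.
def logical_qubits(physical_qubits, physical_per_logical):
    return physical_qubits // physical_per_logical

for ratio in (10, 100, 1000):
    print(f"1,000,000 physical qubits at {ratio}:1 "
          f"=> {logical_qubits(1_000_000, ratio):,} logical qubits")
```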

The final set of lines consists of the double horizontal lines that show how many logical qubits would be needed to implement various algorithms. By viewing where these lines cross a specific quantum computer technology curve, one can see a rough estimate of when the algorithm may become solvable. I have shown four different types of algorithms on the graph (a small sketch of the crossover calculation follows the list). These are:

1.      The number of logical qubits needed to surpass the capabilities of a classical computer. This is assumed to be about 50 qubits, because it would require the classical computer to compute 2^50 states, which is an overwhelming number.

2.      The number of logical qubits needed to perform a quantum chemistry simulation. Estimates are that this would require at least 100 qubits.

3.      The number of logical qubits needed to implement a machine learning algorithm on a quantum computer. For this graph I am estimating it would take about 1000 qubits, although this could vary widely.

4.      Finally, the number of logical qubits needed to factor a 2048 bit number in order to break RSA encryption with a 2048 bit key. This is estimated to take at least 4000 qubits, but could be more depending upon the algorithm used.
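Here is the small sketch of the crossover calculation mentioned above: given a projected logical qubit curve, find the first year each algorithm’s requirement is met. The growth schedule and the starting point of one logical qubit in 2016 are illustrative assumptions; the requirement thresholds are the ones listed above.

```python
# Find the first year a projected logical qubit curve crosses each algorithm's requirement.
requirements = {
    "surpass classical computers": 50,
    "quantum chemistry simulation": 100,
    "machine learning": 1000,
    "factor a 2048 bit number": 4000,
}

def projected_logical_qubits(year, base_year=2016, base_qubits=1, fast_years=10):
    """Logical qubits assuming a doubling every year for ten years, then every two years."""
    elapsed = year - base_year
    fast = min(elapsed, fast_years)
    slow = max(elapsed - fast_years, 0)
    return base_qubits * 2 ** (fast + slow / 2)

for name, needed in requirements.items():
    year = 2016
    while projected_logical_qubits(year) < needed:
        year += 1
    print(f"{name}: ~{needed} logical qubits, reached around {year}")
```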

By using these assumptions and studying the graph, one can come up with some interesting conclusions. First, implementations that require heavy error correction will add considerably to the requirements and to the time before an algorithm can run successfully. For example, we estimate that a quantum computer with 4000 physical qubits will be built by 2023. If those qubits were perfect and required no error correction (i.e. a 1:1 physical to logical ratio), then the 2048 bit number could be factored as early as 2023. However, if heavy error correction is required with a 1000:1 physical to logical ratio, then this factoring could not be accomplished until about 2041, or about 18 years later.
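A quick back-of-the-envelope check of that example: a 1000:1 ratio means the roughly 4000 logical qubits needed for factoring translate into about 4,000,000 physical qubits, or about ten more doublings beyond the 4000 physical qubits expected in 2023, which at the growth rates assumed in the graph lands in the early 2040s.

```python
# Back-of-the-envelope check of the error correction penalty described above.
import math

logical_needed = 4000          # logical qubits to factor a 2048 bit number
ratio = 1000                   # assumed physical-to-logical ratio
physical_in_2023 = 4000        # physical qubits estimated to exist by 2023

physical_needed = logical_needed * ratio
extra_doublings = math.log2(physical_needed / physical_in_2023)

print(f"{physical_needed:,} physical qubits needed, "
      f"about {extra_doublings:.0f} more doublings beyond 2023")
# => 4,000,000 physical qubits needed, about 10 more doublings beyond 2023
```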

So in order to make quantum computing as useful as possible as early as possible, there are three things that the industry needs to work on:

1.      Increasing the number of physical qubits.

2.      Increasing the quality of the qubits so that good results can be obtained with smaller error correction codes that have less overhead.

3.      Finding robust algorithms that may still be able to provide useful answers even when errors occur.  These algorithms could then run on machines with fewer physical qubits.

We would welcome any comments you may have on this analysis. Although some of you may have different opinions on a few of the assumptions we’ve made for qubit growth or algorithm requirements, we think the methodology is sound, and the analysis can easily be recalculated by plugging in different assumptions. Please send any feedback to info@quantumcomputingreport.com.
