It has long been our thesis that one can gain good insight into how quantum computing technology will develop by reviewing what happened in the classical computing industry and asking whether similar things could happen with quantum computing. One of the areas where we see potential similarities is the use of multi-processing technology.

In the initial stages of classical computing, processor systems consisted of just one processor. To achieve higher and higher performance, engineers found ways to increase its speed through the use of faster transistors, pipelining internal stages, and adding dedicated hardware to perform functionality previously implemented in software.

But at some point the market continued to demand more and more performance, and computer designers ran out of tricks for delivering the capability their users needed from a single processor. So they turned to multi-processing to provide the increasing levels of computing capability. By having two or more processors working side-by-side, the amount of work that can be processed can be increased without requiring too much additional processor engineering. But there are some drawbacks to this approach too. Software programs needed to be re-written to take advantage of the new parallel architectures. No longer could programmers write their algorithms as a linear, step-by-step sequence; they needed to figure out ways of parallelizing their algorithms without running into data access conflicts. Fast communication interfaces between the multiple processors needed to be developed to reduce potential interprocessor communication bottlenecks. Even so, there are limits to how effective this technique can be, because the performance gain from each additional processor becomes smaller and smaller.
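The diminishing return from each added processor is captured by Amdahl's law: if a fraction p of a program can be parallelized, the speedup on n processors is 1 / ((1 - p) + p/n). A minimal sketch (the 90% figure below is just an illustrative assumption, not a measurement):

```python
def amdahl_speedup(p, n):
    """Amdahl's law: p = parallelizable fraction, n = number of processors."""
    return 1.0 / ((1.0 - p) + p / n)

# With 90% of the work parallelizable, each added processor helps less:
for n in (1, 2, 4, 8, 16, 1024):
    print(n, round(amdahl_speedup(0.9, n), 2))
```

Even with 1024 processors, the speedup here can never exceed 1 / (1 - 0.9) = 10x, which is why simply adding more processors eventually stops paying off.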

But slowly, the use of multiprocessor architectures became more and more prevalent. Initially this happened at the system level, with multiple discrete CPUs, rather than on a single chip. More recently, multi-core microprocessors have become popular as the number of transistors that can fit on one chip has continued to increase. It is now common to have data centers with thousands of processors in operation that can work together to accomplish amazing tasks, such as answering a search query that encompasses the entire internet within milliseconds.

It is quite possible that quantum computers will go down a similar multi-processor path. Designing a quantum machine presents not only many theoretical and electrical engineering challenges, but mechanical engineering challenges as well. As the number of qubits in a machine increases, the number of control cables needs to increase as well. Not only does the space to fit them become a challenge, but the additional cables that must be routed from room-temperature control electronics to the millikelvin temperatures inside the dilution refrigerator can present thermal challenges as well. Another challenge is expanding the number of qubits on a chip while keeping the gate fidelities the same or even better. Fabrication process control over a larger area becomes more difficult as the die size increases, and crosstalk between qubits can also worsen as the qubit count grows.

So to solve these problems, we believe that quantum computers will turn to the same solution classical computing adopted a few decades earlier: multiprocessing. And since everything needs a buzzword, let’s coin a new one: NISQ-MP. It is our belief that this could become an interesting solution, perhaps not in the short term, but in the medium term.

There are certain technical developments that will need to happen to make this a reality. To make NISQ-MP effective, neighboring quantum computers will need to communicate via entangled qubits. And this is where the technology being developed for the quantum internet comes into play. The interprocessor communication may be easiest for quantum computers based upon photonics, but it might also be possible for machines based upon other technologies, such as superconducting qubits. In those cases, some mechanism will need to be developed to convert a superconducting qubit to a photon-based qubit for transmission, and there are researchers looking at how to do this.
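The entangled-qubit links described above rest on shared Bell pairs. As a toy illustration only (plain NumPy state vectors, not any particular hardware or networking stack), a Bell pair is produced by a Hadamard gate on one qubit followed by a CNOT:

```python
import numpy as np

# Single-qubit |0> state and the two gates involved
zero = np.array([1.0, 0.0])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

# Start in |00>, apply H to the first qubit, then entangle with CNOT
state = np.kron(zero, zero)
state = np.kron(H, np.eye(2)) @ state
bell = CNOT @ state

print(bell)  # ~ [0.707, 0, 0, 0.707]: the (|00> + |11>)/sqrt(2) Bell pair
```

Measuring either qubit of this state instantly fixes the other, which is the correlation that interprocessor quantum links would distribute between neighboring machines.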

So in the future one can envision quantum data centers containing dozens, hundreds, or even thousands of machines linked together by a quantum internet, all within the same large room. But rather than communicating over distances of hundreds of kilometers, the average distance between nodes may be only a few meters. This short distance removes a significant problem: fiber optic cable losses will be negligible, and engineers won’t need to solve the challenges associated with a long-distance quantum internet, such as wavelength conversion and quantum repeaters. So when thinking of a quantum internet, don’t assume that it will only be used over long distances. Short-distance links may also be very important and become a key ingredient in making NISQ-MP a reality.
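The point about negligible fiber loss over a few meters is easy to quantify. Standard telecom fiber attenuates roughly 0.2 dB/km at 1550 nm; a back-of-the-envelope sketch (the 0.2 dB/km figure is a typical value, not the spec of any particular link):

```python
def surviving_fraction(distance_km, atten_db_per_km=0.2):
    """Fraction of photons surviving a fiber run at the given attenuation."""
    loss_db = atten_db_per_km * distance_km
    return 10 ** (-loss_db / 10)

print(surviving_fraction(0.005))  # 5 m in-room link: ~0.9998 survive
print(surviving_fraction(100))    # 100 km long-haul link: only ~1% survive
```

At data-center scale the loss is a rounding error, while over 100 km about 99% of photons are lost, which is exactly why long-distance links need quantum repeaters and short-distance ones do not.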

November 7, 2019