Google received a lot of press last week with the article published in Nature describing a new architecture for quantum computing that combines the advantages of adiabatic quantum computing (AQC) and gate-level quantum approaches. This is certainly an innovative development because previously no other company was pursuing an implementation that combined the two. The advantage of the adiabatic approach (sometimes called quantum annealing) is that it is well suited to optimization problems, which are important and difficult problems in computing, and that it can be programmed simply by changing the strength and coupling parameters. The advantage of the gate-level approach is that you can apply error correction techniques to improve accuracy and resistance to decoherence, which might otherwise steer the computation to the wrong answer.
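To make the "strength and coupling parameters" point concrete, here is a minimal classical sketch of the Ising-style energy function an annealer minimizes. The specific values of `h` (per-qubit bias, the "strength") and `J` (pairwise coupling) are illustrative, not taken from any real machine; programming the problem amounts to choosing these numbers, and the hardware's job is to find the low-energy spin configuration (brute-forced here for a toy size).

```python
import itertools

def ising_energy(spins, h, J):
    """Energy of a spin configuration under an Ising Hamiltonian:
    E = sum_i h[i]*s_i + sum_{i<j} J[(i,j)]*s_i*s_j."""
    e = sum(h[i] * s for i, s in enumerate(spins))
    e += sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
    return e

def ground_state(n, h, J):
    """Brute-force search for the lowest-energy configuration
    (an annealer does this physically; fine classically for tiny n)."""
    return min(itertools.product([-1, 1], repeat=n),
               key=lambda s: ising_energy(s, h, J))

# Illustrative parameters for three spins:
# J[(0,1)] < 0 favors alignment, J[(1,2)] > 0 favors anti-alignment,
# and a small bias h[2] > 0 pushes spin 2 toward -1.
h = [0.0, 0.0, 0.1]
J = {(0, 1): -1.0, (1, 2): 1.0}
print(ground_state(3, h, J))  # → (1, 1, -1)
```

Changing `h` and `J` changes which configuration is lowest in energy, which is what "programmability" means in this model: the same machine solves a different optimization problem for each parameter set.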
However, before we write the obituary for D-Wave, there are several other factors that were not mentioned in the popular press. The Google paper described a 9-qubit proof-of-concept implementation, far smaller than D-Wave’s current 1097-qubit Washington chip. Scaling up takes a lot of time and effort, and we estimate it may take Google another couple of years to reach the next generation of 40-50 qubits. Moreover, while the theoretical benefit of a digitized AQC is that error correction can be added, this initial implementation does not yet include any. The overhead for adding error correction is at least 2X, so an implementation that offers 40 logical qubits will require at least 80 physical qubits. This will add to the die size, manufacturing complexity, and cost of the underlying chip. And remember, the AQC approach is primarily meant to solve optimization problems, where in many cases 100% accuracy may not be necessary. For example, suppose the computer is solving a traveling salesman problem for someone visiting all the major cities on the West Coast of the U.S. Would it really matter if the computer believes the distance between Los Angeles and San Francisco is 363 miles instead of the 382 miles it really is? Probably not. The computer would most likely end up with the same city-to-city travel order anyway. (Note that there are other problems where this type of error would be a major problem. For example, if you are using Shor’s algorithm to factor a 2048-bit number, a 5% error is a non-starter. But factoring was never meant for an AQC machine.)
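The West Coast example is easy to check directly. The sketch below brute-forces the shortest visiting order over five cities using approximate driving distances (illustrative figures, not authoritative), then injects exactly the error from the text, reporting LA-SF as 363 miles instead of 382, and confirms the optimal order does not change.

```python
import itertools

def tour_length(order, dist):
    """Total length of an open tour visiting cities in the given order."""
    return sum(dist[frozenset((a, b))] for a, b in zip(order, order[1:]))

def best_tour(cities, dist):
    """Brute-force the shortest open tour (fine for a handful of cities)."""
    return min(itertools.permutations(cities),
               key=lambda order: tour_length(order, dist))

cities = ["Seattle", "Portland", "SanFrancisco", "LosAngeles", "SanDiego"]
# Approximate driving distances in miles (illustrative).
dist = {frozenset(p): d for p, d in {
    ("Seattle", "Portland"): 174, ("Seattle", "SanFrancisco"): 808,
    ("Seattle", "LosAngeles"): 1135, ("Seattle", "SanDiego"): 1255,
    ("Portland", "SanFrancisco"): 635, ("Portland", "LosAngeles"): 965,
    ("Portland", "SanDiego"): 1085, ("SanFrancisco", "LosAngeles"): 382,
    ("SanFrancisco", "SanDiego"): 500, ("LosAngeles", "SanDiego"): 120,
}.items()}

exact = best_tour(cities, dist)

# Inject the ~5% error from the text: LA-SF reported as 363, not 382.
noisy = dict(dist)
noisy[frozenset(("SanFrancisco", "LosAngeles"))] = 363
print(best_tour(cities, noisy) == exact)  # → True: same visiting order
```

The intuition this captures: a bounded error on one edge shifts every candidate tour's length by at most that error, so unless two tours are nearly tied, the ranking, and hence the answer, survives.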
One other factor that I think is holding back the usage of quantum computing is the challenge of mapping real-world problems onto the available architectures. Three factors affect this:
1. The internal qubit-to-qubit connection topology in the machine.
2. The availability of higher-level software that can map high-level algorithms into machine instructions.
3. Application developers who can help end users implement solutions that run on the machines.
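A toy illustration of the first factor: a problem's interaction graph may demand couplings the hardware simply does not wire up. The hardware and problem graphs below are hypothetical, and real systems work around such gaps with techniques like minor embedding (chaining several physical qubits to represent one logical variable), but the check itself shows why topology constrains what maps cleanly onto a machine.

```python
def unsupported_couplings(problem_edges, hardware_edges):
    """Couplings the problem needs that the hardware graph lacks,
    for a fixed assignment of logical qubits to physical qubits."""
    hw = {frozenset(e) for e in hardware_edges}
    return [e for e in problem_edges if frozenset(e) not in hw]

# Hypothetical 4-qubit hardware wired as a square (cycle): no diagonals.
hardware = [(0, 1), (1, 2), (2, 3), (3, 0)]

# A fully connected 4-variable problem also needs the two diagonals.
problem = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]

print(unsupported_couplings(problem, hardware))  # → [(0, 2), (1, 3)]
```

Every coupling reported here would force extra work, either re-mapping variables or spending additional physical qubits, which is exactly the kind of overhead that makes flexible topologies attractive.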
Although it is too early to tell for sure, the digitized AQC approach may eventually offer more flexible topologies than an analog approach. However, it is clear that D-Wave is working hard to develop the necessary higher-level software and to recruit application developers who can provide solutions that run on its machines. In the computing world, companies can sometimes achieve critical mass and create what is known as a standardized “platform”. One of the best-known platforms is the “Wintel” platform created by Microsoft and Intel, which has dominated the PC market for decades. Such platforms sometimes achieve leading revenue share even when they are not the best technical solution. So it will be interesting to see if D-Wave, with its head start, can make its platform a standard.
Nonetheless, the digitized AQC approach has many merits, and we welcome this development. We are sure it will be important for quantum computing in the years to come. Just don’t hold your breath expecting to see any machines for sale with this feature by next week.