We’ve seen a great many quantum computing presentations that show an almost obligatory slide titled “Moore’s Law is Ending”. It is meant to convey the notion that quantum computing will be required in order to continue the historical increase in computing power.
However, these presentations are missing something important. Although Moore’s Law is indeed ending, in recent years there has been increased development of new classical computing architectures that will continue to drive improvements in classical computing power.
For most of computing history, processors have been based upon what is called the Von Neumann architecture, originally proposed by John Von Neumann in 1945. This architecture called for separate central processor, memory, mass storage, and input-output units connected together by data buses. Although computers have certainly increased in power over the past 70 years, these increases have been driven by increases in transistor speed, additional parallelism in the central processor, the creation of caching hierarchies in the memory subsystem, and the use of many CPU cores and processing units in parallel. All of these advances were enabled by the increasing capabilities of semiconductor technology as indicated by Moore’s Law. But the computers were still all based upon Von Neumann’s original architecture design.
However, much like Moore’s Law, some people now believe that the Von Neumann architecture is hitting a wall, and alternative approaches are being explored to replace it. Examples include computation units built from graphics processing units (GPUs), Memory-Driven Computing, neuromorphic computing, computational storage, hyperdimensional computing, Memcomputing, and reconfigurable computing using FPGAs. Some of these have already shown significant performance increases and have demonstrated clear advantages over classical Von Neumann architecture computers for certain applications.
We expect that significant progress in these new architectures will continue over the next decade or two and that innovations in these architectures will compensate for any slowdowns in semiconductor technology. So those working on demonstrating quantum advantage need to realize that it is a moving target. Problems that seem very difficult to solve classically today may become more tractable with one of these new architectures, even though the new architectures don’t utilize quantum mechanical principles.
However, there still appears to be a basic advantage in quantum computing that may allow it to win in the longer term. Although the new classical architectures will provide significant performance improvements, they will still all tend to scale in performance linearly, whereas a quantum computer can scale exponentially.
The best way to show this is with a hypothetical (and not necessarily to scale) example. Consider three data center managers named Allen, Bill, and Cindy, who manage different data centers that solve very high performance computational problems. Allen’s data center utilizes 5,000 of the latest Intel Cascade Lake Xeon Platinum 9200 based servers to provide huge amounts of computing power. Bill’s data center utilizes 500 of the latest NVIDIA Volta Tensor Core GPUs, which also provide very high performance. And Cindy’s data center has a single quantum computer with 50 qubits.
As the end users start submitting larger and larger problems to the data centers, they begin to realize that even higher performance is needed. So Allen, Bill, and Cindy go to their management and request additional budgets to double the performance of their data centers. Allen gets approval to increase his budget by 100% to upgrade from 5,000 to 10,000 Intel based servers. Bill also gets approval to increase his budget by 100% to go from 500 to 1,000 NVIDIA GPUs. But because Cindy is using quantum technology, she only needs to increase her data center’s quantum computer from 50 to 51 qubits to achieve a similar 2X increase in performance.
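The arithmetic behind this story can be sketched in a few lines of Python. This is an illustrative toy model, not a real performance benchmark: it assumes classical capacity grows linearly with the number of units, and uses the size of a quantum computer’s state space (2 to the power of the number of qubits) as a stand-in for its potential advantage on suitable problems. The function names are ours, invented for the sketch.

```python
def classical_capacity(units: int, per_unit: float = 1.0) -> float:
    """Toy model: classical performance scales linearly with unit count."""
    return units * per_unit


def quantum_state_space(n_qubits: int) -> float:
    """Number of amplitudes an n-qubit register can represent: 2**n."""
    return float(2 ** n_qubits)


# Allen doubles his server count: 5,000 -> 10,000 yields a 2x gain.
print(classical_capacity(10_000) / classical_capacity(5_000))   # 2.0

# Cindy adds a single qubit: 50 -> 51 also yields a 2x gain.
print(quantum_state_space(51) / quantum_state_space(50))        # 2.0

# For the next doubling, Allen needs 10,000 more servers;
# Cindy needs just one more qubit (52 qubits = 4x her starting point).
print(quantum_state_space(52) / quantum_state_space(50))        # 4.0
```

The point of the sketch is the asymmetry in cost per doubling: each classical doubling requires doubling the hardware budget, while each quantum doubling requires only one additional qubit, at least for problems with suitable quantum algorithms.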
It should be noted that some problems are not amenable to solution on any quantum computer. So classical data centers will always be required. But for those problems that do have quantum algorithms available, the exponential scaling factor will ultimately allow quantum computers to show an advantage, even if it takes longer to achieve this than people expect.