The full report is available for purchase on the GQI website.
Real quantum applications do now exist, though they are currently modest compared to expectations. As early adopter interest grows, how are the prospects for commercially relevant algorithms shaping up? We examine both the powerful challenges and the powerful new opportunities.
The strategic algorithms playing field
Quantum computers are set to be much slower than conventional computers when measured on clock speed alone. The real advantage they offer comes from the unique quantum algorithms they promise to run. Much hype has surrounded the claims and expectations for what such algorithms can achieve.
Some quantum algorithms offer remarkable (exponential) speedups, others more modest (quadratic) speedups. Some are heuristics whose benefits must be evaluated by trial and error.
It is not possible to divorce algorithms from the specs of the hardware on which they are intended to run. Academics have been developing quantum algorithms for 30 years. Early quantum hardware varied from non-existent to very poor. However, it was known that, beyond a certain fidelity threshold, gate-model architectures could in principle use quantum error correction to turn noisy physical qubits into better logical qubits. It was therefore natural for algorithm theorists to focus on the abstraction of fault tolerant quantum computation (FTQC).
The most famous gate-model quantum primitives all date from this era: the quantum Fourier transform (QFT), used, for example, in Shor's algorithm for factoring; quantum phase estimation (QPE), used, for example, in the HHL algorithm for linear algebra and in quantum simulation; and amplitude amplification (AA), used, for example, in Grover's algorithm for unstructured search and in discrete-time quantum walks for Monte Carlo.
QFT and QPE typically offer an exponential advantage, though sometimes with important caveats. AA typically offers a quadratic advantage, again with caveats. These primitives typically require the high-depth quantum circuits implicitly provided by FTQC.
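To make the first of these primitives concrete, here is a minimal sketch of the QFT as an explicit unitary matrix, written in plain NumPy rather than any vendor SDK; the function name and the small three-qubit example are ours, for illustration only:

```python
import numpy as np

def qft_matrix(n_qubits: int) -> np.ndarray:
    """Unitary matrix of the quantum Fourier transform on n_qubits."""
    dim = 2 ** n_qubits
    omega = np.exp(2j * np.pi / dim)
    j, k = np.meshgrid(np.arange(dim), np.arange(dim))
    return omega ** (j * k) / np.sqrt(dim)

qft = qft_matrix(3)
# Sanity check: F is unitary, F . F^dagger = I
assert np.allclose(qft @ qft.conj().T, np.eye(8))

# The QFT of a basis state |j> is a uniform-magnitude phase ramp
state = np.zeros(8)
state[5] = 1.0
print(np.round(qft @ state, 3))
```

The circuit version of this 2^n-by-2^n transform needs only O(n^2) gates, and that compression is the source of the exponential advantage exploited in applications such as Shor's algorithm.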
More recently, it has been shown that all of these primitives can be understood as arising from one unified framework, a grand unified quantum primitive: the quantum singular value transformation (QSVT).
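In symbols (the standard textbook statement of QSVT, paraphrased here rather than taken from the report): given a block-encoding of a matrix A, QSVT implements a chosen polynomial transformation of its singular values.

```latex
% A has singular value decomposition over singular vectors |w_i>, |v_i|;
% QSVT applies a suitable degree-d polynomial p to the singular values,
% using d applications of the block-encoding of A:
\[
  A \;=\; \sum_i \sigma_i \,\lvert w_i\rangle\langle v_i\rvert
  \quad\longmapsto\quad
  p^{(\mathrm{SV})}(A) \;=\; \sum_i p(\sigma_i)\,\lvert w_i\rangle\langle v_i\rvert
\]
```

Choosing the polynomial p appropriately recovers phase-estimation-like, amplitude-amplification-like, and Hamiltonian-simulation behaviour as special cases, which is the sense in which QSVT unifies the earlier primitives.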
When physical devices improved to the point that they could demonstrate some 'beyond classical' capabilities, a new focus naturally developed: what might we be able to do directly with these noisy intermediate-scale quantum (NISQ) devices?
For gate-model approaches, limited fidelities mean that high-depth quantum circuits are not feasible. Variational quantum algorithms (VQAs) have emerged as the most promising body of techniques aimed at NISQ devices, combining low-depth parameterized quantum circuits with classical processing in an iterative loop.
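A minimal self-contained sketch of that loop, using a toy one-qubit problem simulated in NumPy rather than run on hardware (the Ry ansatz, Pauli-Z 'Hamiltonian', learning rate, and parameter-shift gradient are illustrative choices, not taken from the report):

```python
import numpy as np

# Pauli-Z observable playing the role of the problem Hamiltonian.
Z = np.array([[1, 0], [0, -1]], dtype=float)

def ansatz(theta: float) -> np.ndarray:
    """State Ry(theta)|0> -- stands in for the low-depth quantum circuit."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta: float) -> float:
    """Expectation value <psi|Z|psi> -- the 'measurement' step."""
    psi = ansatz(theta)
    return float(psi @ Z @ psi)

# Classical outer loop: gradient descent with the parameter-shift rule.
theta, lr = 0.1, 0.4
for _ in range(50):
    grad = 0.5 * (energy(theta + np.pi / 2) - energy(theta - np.pi / 2))
    theta -= lr * grad

print(f"theta = {theta:.3f}, energy = {energy(theta):.4f}")  # near -1 at theta = pi
```

On a real NISQ device the energy() call would be replaced by many repeated circuit executions and measurement averaging, and shot noise plus limited fidelity make the classical optimizer's job correspondingly harder.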
Quantum annealing (QA) can be seen as a NISQ-era implementation of adiabatic quantum computing, an alternative to gate-model computation.
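Schematically, and using standard adiabatic notation rather than anything specific from the report, the annealer interpolates between an easy-to-prepare driver Hamiltonian and a problem Hamiltonian whose ground state encodes the answer:

```latex
% Adiabatic interpolation over an annealing schedule s from 0 to 1:
\[
  H(s) \;=\; (1 - s)\, H_{\mathrm{driver}} \;+\; s\, H_{\mathrm{problem}},
  \qquad s : 0 \to 1
\]
```

If the schedule is slow enough relative to the minimum spectral gap along the path, the adiabatic theorem says the system tracks the instantaneous ground state and finishes in a low-energy configuration of the problem Hamiltonian; real annealers run faster than this guarantee allows, which is part of why they are heuristics.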
These techniques are heuristics: in contrast to the full-scale FTQC techniques described above, we have no guarantee of their accuracy, performance, or the speedup they may offer.
As developer roadmaps now increasingly project forward to the introduction of fault tolerant techniques, it is becoming clear that early FTQC devices will still not offer the specs needed to run all full-scale FTQC algorithms. Algorithms suited to this new middle ground are an increasingly important focus.
January 14, 2023