By Dr Chris Mansell, Senior Scientific Writer at Terra Quantum

Shown below are summaries of a few interesting research papers in quantum technology that we have seen over the past month.

Hardware

Title: Logical quantum processor based on reconfigurable atom arrays
Organizations: Harvard University; QuEra Computing Inc.; NIST/University of Maryland; Massachusetts Institute of Technology
Rydberg interactions enable logic gates to be performed between ultracold atoms. The infidelity of such gates has decreased exponentially from above 0.1 over a decade ago to well below 0.01 currently. Some of the recent success has arisen from making the atoms colder, placing them closer together and subjecting them to fewer, higher-power laser pulses. In this paper, the high-fidelity gates were crucial, as was the ability to coherently move many atoms in parallel from one end of the atom array to the other. The headline result is that 280 atoms were used to make up to 48 logical qubits and demonstrate various quantum error correction protocols. Future work will involve technical improvements to the system that may allow repetitive error correction to be performed during a logical quantum algorithm.
Link: https://www.nature.com/articles/s41586-023-06927-3

Title: Breaking the Entangling Gate Speed Limit for Trapped-Ion Qubits Using a Phase-Stable Standing Wave
Organization: University of Oxford
One of the DiVincenzo criteria for qubits is that their coherence time must be much longer than the average duration of their logic gates. Experimental measurements of trapped-ion qubits put their coherence-time-to-gate-duration ratio at about one million, which is better than the other leading quantum computing platforms by several orders of magnitude. However, all other things being equal, shorter gate durations are better. Even though the logic gates between trapped ions have very high fidelities, they are considerably slower than their rivals. In this paper, the researchers used standing waves to experimentally implement fast gates between ions. This approach could be extended to mitigate the leading source of error in one of their prior schemes for fast, high-fidelity gates based on Raman transitions.
Link: https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.131.220601
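As a back-of-the-envelope check of the ratio quoted above, the numbers below are illustrative assumptions (the paper itself does not report them directly): a memory coherence time of around ten minutes and a two-qubit gate lasting hundreds of microseconds give a ratio of roughly one million.

```python
# Illustrative values only, chosen to reproduce the quoted order of magnitude.
coherence_time_s = 600.0      # ~10 minutes of qubit coherence
gate_duration_s = 600e-6      # ~600 microsecond two-qubit gate

ratio = coherence_time_s / gate_duration_s
print(f"coherence time / gate duration ≈ {ratio:.0e}")
```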

Title: Vacuum Beam Guide for Large Scale Quantum Networks
Organizations: University of Chicago; LIGO Laboratory, California Institute of Technology; Stanford University
Which is more impressive: the Laser Interferometer Gravitational-Wave Observatory (LIGO), which allows us to detect ripples in the fabric of space-time, or the billions of kilometers of optical fiber that have transformed the way information is sent across the globe? Quantum technologists are always on the lookout for whatever is most effective, and in the case of quantum communication, the attenuation coefficient of today’s optical fibers is too high. With the Earth’s turbulent atmosphere limiting the range of satellite-based quantum channels, the authors of this preprint suggest that the way forward is to repurpose the incredible tools and techniques employed by LIGO. In particular, they calculate that high-precision optical elements placed inside a long-distance vacuum enclosure would let quantum information propagate with an attenuation coefficient three orders of magnitude smaller than that of fibers. With so little attenuation, a continental-scale quantum network could be established without relying on quantum repeaters.
Link: https://arxiv.org/abs/2312.09372
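To see why three orders of magnitude matter, consider how transmission decays exponentially with distance. The sketch below uses the standard value of about 0.2 dB/km for telecom fiber at 1550 nm; the vacuum-beam-guide figure is simply that value divided by 1000, per the paper’s claim.

```python
def transmission(alpha_db_per_km, distance_km):
    """Fraction of the signal surviving a channel with the given attenuation."""
    return 10 ** (-alpha_db_per_km * distance_km / 10)

fiber_alpha = 0.2                 # dB/km, typical telecom fiber at 1550 nm
guide_alpha = fiber_alpha / 1000  # three orders of magnitude smaller, per the paper

for d_km in (100, 1000, 4000):    # 4000 km is roughly continental scale
    print(f"{d_km:>5} km | fiber: {transmission(fiber_alpha, d_km):.1e} | "
          f"vacuum guide: {transmission(guide_alpha, d_km):.3f}")
```

Over 4000 km, fiber transmits essentially nothing, while the vacuum guide keeps most of the signal, which is why quantum repeaters become unnecessary.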

Title: Reconstructing Complex States of a 20-Qubit Quantum Simulator
Organizations: University of Calgary; University of Oxford; Russian Quantum Center; Technology Innovation Institute, Abu Dhabi; Universität Innsbruck; Alpine Quantum Technologies GmbH; Österreichische Akademie der Wissenschaften; National University of Science and Technology “MISIS”
Quantum state tomography is the task of fully characterising an unknown quantum state given many independent copies of it. When there is no guarantee that this unknown state belongs to a particular class of states, exponentially many copies are needed. In practice, however, an experimenter will typically know that the quantum state was produced by, say, a one-dimensional quantum processor with nearest-neighbour interactions.
In such a scenario, one could describe the unknown state using a parameterized ansatz. In this paper, the researchers considered data from a 20-qubit ion trap experiment and employed ansatzes based on either neural networks or matrix product states. They found that the latter was superior for the data in question. They attributed this to a couple of positive attributes of the ansatz but cautioned that it would not work well if the quantum processor were to produce states with volume-law entanglement scaling. 
Link: https://journals.aps.org/prxquantum/abstract/10.1103/PRXQuantum.4.040345
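The appeal of the matrix-product-state ansatz mentioned above is its parameter count: for a fixed bond dimension, it grows linearly with the number of qubits rather than exponentially. A minimal sketch of that counting argument (not taken from the paper’s code):

```python
def mps_params(n_qubits, bond_dim, phys_dim=2):
    """Parameter count of an open-boundary matrix product state.

    Site k holds a tensor of shape (chi_left, phys_dim, chi_right), where the
    bond dimensions grow from 1 at the edges up to the cap `bond_dim`.
    """
    total = 0
    chi_left = 1
    for k in range(n_qubits):
        # Maximum useful bond dimension at this cut, capped at bond_dim.
        chi_right = min(bond_dim, phys_dim ** (k + 1), phys_dim ** (n_qubits - k - 1))
        total += chi_left * phys_dim * chi_right
        chi_left = chi_right
    return total

full = 2 ** 20                        # complex amplitudes of a generic 20-qubit state
compressed = mps_params(20, bond_dim=32)
print(full, compressed)               # the MPS description is dramatically smaller
```

A state with volume-law entanglement would force the bond dimension to grow exponentially with system size, which is exactly the caveat the authors raise.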

Software

Title: Does provable absence of barren plateaus imply classical simulability? Or, why we need to rethink variational quantum computing
Organizations: Los Alamos National Laboratory; Oak Ridge National Laboratory; Universidad Nacional de La Plata; University of Strathclyde; Ecole Polytechnique Fédérale de Lausanne; Donostia International Physics Center; University of Waterloo; Vector Institute; Chulalongkorn University; Caltech
At first glance, variational quantum algorithms appear easier to design than conventional algorithms. However, experience has shown that barren plateaus are so prevalent that this is not the case. As the title of the paper suggests, we may need to rethink variational quantum computing. The authors of the paper argue that making a variational quantum algorithm easier to train inadvertently makes it easier to classically simulate. There are a few caveats. Firstly, the classical computer simulating the algorithm may still need access to some data generated by an initial run of a quantum computer.
This means that even though the variational aspect of the model is no longer needed, quantum computers still play an important role. Secondly, there is the possibility that, just like with classical neural networks, things will work better in practice than they do in theory. In particular, the results of the paper only apply to models where we can prove that there aren’t barren plateaus. This means that when we cannot prove anything about the model, it may be easy to train but nevertheless hard to simulate.
Link: https://arxiv.org/abs/2312.09121

Title: Quantum Multiple Kernel Learning in Financial Classification Tasks
Organizations: IBM; HSBC
Popular machine learning methods usually process one data point at a time (e.g., supervised learning using either a quantum circuit or a classical neural network). However, kernel-based machine learning protocols accept pairs of data points and evaluate how similar they are. This makes them well suited to classification tasks because the data points in one class should be similar to each other but different from those belonging to another class. In this paper, a quantum multiple kernel algorithm was used for financial classification tasks. It was run on IBM quantum hardware and error mitigation was employed, with promising results being demonstrated for up to 20 qubits.
Link: https://arxiv.org/abs/2312.00260
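The idea of combining several kernels can be illustrated classically. In the paper, the individual kernels are evaluated on quantum hardware as overlaps between quantum feature states; in the sketch below, ordinary Gaussian kernels with different length scales stand in for them, and the combination weights are fixed rather than learned.

```python
import math

def rbf(x, y, gamma):
    """Gaussian kernel: similarity decays with squared distance."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def combined_kernel(x, y, weights=(0.5, 0.5), gammas=(0.1, 5.0)):
    """A 'multiple kernel' is a weighted sum of base kernels."""
    return sum(w * rbf(x, y, g) for w, g in zip(weights, gammas))

# Toy classification task: two well-separated clusters of 2D points.
class_a = [(0.0, 0.0), (0.2, 0.1), (-0.1, 0.2)]
class_b = [(3.0, 3.0), (2.8, 3.1), (3.2, 2.9)]

def classify(x):
    # Assign x to whichever class it is more similar to on average.
    sim_a = sum(combined_kernel(x, p) for p in class_a) / len(class_a)
    sim_b = sum(combined_kernel(x, p) for p in class_b) / len(class_b)
    return "A" if sim_a > sim_b else "B"

print(classify((0.1, 0.1)), classify((3.0, 2.9)))
```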

Title: Statistical Phase Estimation and Error Mitigation on a Superconducting Quantum Processor
Organizations: Riverlane; University of Sheffield; Astex Pharmaceuticals
Quantum phase estimation is a quantum algorithm for estimating the eigenvalues of a unitary operator; one of its most prominent applications is calculating the ground-state energies of molecules. The authors of this paper proposed a new way to do this that improves the accuracy by one to two orders of magnitude compared to earlier theoretical results. Using seven qubits of a Rigetti superconducting quantum processor, they applied their method to chemicals that have active spaces with up to four electrons in four spatial orbitals. They used error mitigation techniques and found the correct energies to within chemical precision.
Link: https://journals.aps.org/prxquantum/abstract/10.1103/PRXQuantum.4.040341
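The paper’s statistical phase estimation builds a spectral estimate from many randomized measurements; the core primitive, though, is recovering a phase from the statistics of simple interference experiments. Below is a heavily simplified classical simulation of that primitive, assuming ideal Hadamard-test measurements on an exact eigenstate (a sketch, not the authors’ method).

```python
import math
import random

random.seed(7)
true_phase = 1.234  # eigenphase of the unitary on its eigenstate (radians)

def hadamard_test(phase, shots, imaginary=False):
    """Sample the mean of +/-1 outcomes of a simulated Hadamard test.

    The real (imaginary) variant yields +1 with probability
    (1 + cos(phase)) / 2  (respectively (1 + sin(phase)) / 2).
    """
    p_plus = (1 + (math.sin(phase) if imaginary else math.cos(phase))) / 2
    return sum(1 if random.random() < p_plus else -1 for _ in range(shots)) / shots

shots = 200_000
re = hadamard_test(true_phase, shots)         # estimates cos(phase)
im = hadamard_test(true_phase, shots, True)   # estimates sin(phase)
estimate = math.atan2(im, re)
print(f"estimated phase: {estimate:.3f}")
```

With a finite shot budget, the statistical error shrinks as one over the square root of the number of shots, which is why error mitigation on noisy hardware matters so much for reaching chemical precision.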

Title: Quantum Optimization: Potential, Challenges, and the Path Forward
Organizations: Quantum Optimization Working Group
Computers are devices that reliably carry out logical, mathematical operations and, as such, their capabilities are extremely amenable to theoretical analysis. They are also highly engineered, physical systems that can be empirically tested in numerous ways. Both approaches are responsible for the computing industry’s incredible progress. This review paper on quantum optimization draws insights from computational complexity theory before discussing the practicalities of noisy quantum hardware and the importance of benchmarks. From finance to sustainability, the impact that improved optimization algorithms could have on our world at large is clear. The likelihood that these improvements could come from quantum algorithms is the central question that this paper comprehensively explores.
Link: https://arxiv.org/abs/2312.02279

December 23, 2023