Better Therapeutics with Near-Term Quantum Computers

We often hear people make prognostications about when a quantum computer will start solving valuable real-world problems. The answer we usually hear is 3-5 years, and it is mostly based on someone's gut feel rather than a more thorough, rigorous analysis.

So we were interested to see a recent paper posted by Qulab that provides a more detailed study of exactly how many qubits would be required to perform a molecular simulation of certain biomolecules. Such simulations would be helpful for improving computational drug discovery and CRISPR gene editing. The team used Microsoft's Q# programming language and its associated quantum chemistry package to estimate the number of qubits needed to simulate twenty different dipeptide molecules. They estimate that this could be done with anywhere between 88 and 276 qubits on a non-error-corrected NISQ machine, depending upon the particular dipeptide. In the paper, they describe some of the techniques, such as active space reduction, that can be used to reduce the qubit requirements.
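To give a feel for how an active space translates into a qubit count, here is a minimal Python sketch. It assumes a one-qubit-per-spin-orbital encoding (e.g., Jordan-Wigner), and the orbital counts are hypothetical placeholders chosen only so the outputs fall in the 88-276 range quoted above; they are not figures taken from the paper.

```python
# Rough, illustrative qubit estimate for an active-space molecular simulation.
# Under a Jordan-Wigner-style encoding, each spin orbital maps to one qubit,
# so an active space of N spatial orbitals needs about 2 * N qubits.
# The orbital counts below are hypothetical placeholders, not figures from the paper.

def qubits_for_active_space(n_spatial_orbitals: int) -> int:
    """Two spin orbitals (one qubit each) per spatial orbital."""
    return 2 * n_spatial_orbitals

if __name__ == "__main__":
    for label, n_orbitals in [("smaller dipeptide (hypothetical)", 44),
                              ("larger dipeptide (hypothetical)", 138)]:
        print(f"{label}: ~{qubits_for_active_space(n_orbitals)} qubits")
```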

This research is by no means finished. Additional work still needs to be performed to figure out how to map these algorithms onto actual quantum hardware. There are considerations such as accounting for the native gate set of a particular machine, adjusting for the qubit connectivity a machine may have, adding ancilla qubits for error correction, and other factors. But based on the estimate we made earlier in our article Applying Moore's Law to Quantum Qubits, we would expect machines with 300-1000 physical qubits to be available in the next 2-4 years. In addition, further work is needed to better understand the required gate depth as well as the number of iterations the VQE (Variational Quantum Eigensolver) algorithm will need. Both will significantly impact how long it takes to calculate a solution.
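To illustrate why the number of VQE iterations matters, here is a minimal sketch of the VQE feedback loop using a classically simulated one-qubit toy Hamiltonian and SciPy's COBYLA optimizer. The Hamiltonian, ansatz, and optimizer are arbitrary choices for illustration, not anything taken from the paper; on real hardware each energy evaluation would itself require many circuit executions.

```python
# Minimal VQE-style loop on a classically simulated one-qubit toy Hamiltonian.
# Illustrative only: the Hamiltonian, ansatz, and optimizer below are arbitrary
# choices, not the ones used in the Qulab paper.
import numpy as np
from scipy.optimize import minimize

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Toy Hamiltonian H = 0.5*Z + 0.3*X (coefficients chosen arbitrarily)
H = 0.5 * Z + 0.3 * X

def energy(theta):
    """Expectation value <psi(theta)|H|psi(theta)> for the ansatz Ry(theta)|0>."""
    psi = np.array([np.cos(theta[0] / 2), np.sin(theta[0] / 2)], dtype=complex)
    return float(np.real(psi.conj() @ H @ psi))

result = minimize(energy, x0=[0.1], method="COBYLA")
exact = float(np.min(np.linalg.eigvalsh(H)))

print(f"VQE estimate: {result.fun:.6f}   exact ground state: {exact:.6f}")
print(f"Energy evaluations (each a batch of circuit runs on hardware): {result.nfev}")
```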

For more details, you can read the paper posted on arXiv here, a Medium article that summarizes the results here, and a press release announcing the paper here.

7/19/2019
2 Comments
QuantumX
@ 4:27 am

Sorry, but there is something I don't get. Current classical computers are capable of simulating quantum computers with up to 50-70 logical qubits. I would guess they could easily simulate 300 or more noisy qubits. So what's the advantage?

    @ 12:33 pm

    None of the 50+ qubit classical simulations being reported are simulating any form of error correction, such as the surface code. So the Logical:Physical ratio can be thought of as being 1:1. A direct simulation would require one to store about 2^50 complex numbers which is close to the limit of the Summit supercomputer. See the paper at https://arxiv.org/pdf/1905.00444.pdf by a team from Google, NASA, ORNL, and U of Illinois for details.

    Because memory and processing requirements increase exponentially with the number of qubits, a 300-qubit simulation would require storing roughly 2^300 complex numbers, which is more than the estimated number of atoms in the universe. The fact that the qubits might be noisy would not be of much help.

    Doug Finke
    Managing Editor
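
To put rough numbers on the scaling described in the reply above, here is a quick back-of-envelope calculation assuming 16 bytes per complex amplitude (two double-precision floats); the qubit counts shown are just illustrative.

```python
# Memory required to hold a full state vector of n qubits,
# assuming 16 bytes per complex amplitude (two 64-bit floats).
BYTES_PER_AMPLITUDE = 16

def state_vector_bytes(n_qubits: int) -> float:
    return float(2 ** n_qubits) * BYTES_PER_AMPLITUDE

for n in (30, 50, 300):
    print(f"{n:>3} qubits: 2^{n} amplitudes, about {state_vector_bytes(n):.2e} bytes")
```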

