A major topic of discussion in the classical computing world is On-Premise versus Cloud Computing. Major enterprises that historically performed their data processing within their own facilities have been shifting more of their IT load over the past decade to specialized cloud services such as Amazon Web Services, Microsoft Azure, Google Cloud, and others. Some enterprises are also using a hybrid cloud model, which combines private computing resources owned by the enterprise with public cloud services.

Most of the current quantum computing activity is either based in the public cloud or occasionally in a private cloud, such as the D-Wave processor installed at NASA’s Ames Research Center and used by NASA and the USRA (Universities Space Research Association). But without exception these installations are located in well-controlled environments and not just any random data center.

There are several reasons for this. The first is the sheer cost of the quantum computers. Sales prices for these machines are in the $10-$15 million range, and they also require yearly maintenance contracts with the supplier that can exceed $1 million per year. Those price tags are well beyond the budget of most organizations. Even those that could afford this price would want to achieve close to 100% utilization of the machine and process jobs 24/7 so they could get a good return on their investment. But few organizations, if any, have the level of workload needed to get close to that utilization rate.
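To make the utilization argument concrete, here is a minimal back-of-the-envelope sketch in Python. The purchase price, maintenance fee, and five-year amortization period are illustrative assumptions chosen to fall within the ranges above, not actual vendor figures.

```python
# Rough cost model for an on-premise quantum computer.
# All figures are illustrative assumptions, not vendor quotes.

PURCHASE_PRICE = 12_000_000      # assumed mid-range sales price, USD
ANNUAL_MAINTENANCE = 1_000_000   # assumed yearly maintenance contract, USD
AMORTIZATION_YEARS = 5           # assumed useful life of the machine

HOURS_PER_YEAR = 24 * 365

annual_cost = PURCHASE_PRICE / AMORTIZATION_YEARS + ANNUAL_MAINTENANCE

# Effective cost per compute hour at different utilization rates.
for utilization in (1.00, 0.50, 0.10):
    used_hours = HOURS_PER_YEAR * utilization
    print(f"{utilization:4.0%} utilization -> ${annual_cost / used_hours:,.0f} per compute hour")
```

Under these assumptions, the effective cost is a few hundred dollars per compute hour at full utilization but balloons to several thousand dollars per hour at 10% utilization, which is why utilization dominates the economics.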

Besides the financial reasons, there are also many operational reasons why it would not make sense for most enterprises to host an on-premise quantum computer. First, many of these machines are being constantly upgraded and improved with new firmware and other minor changes. Many of them require frequent calibrations that may require on-site maintenance personnel to perform. If a spare part is needed, there are standard delivery logistics that one has to plan for. And finally, one should remember that processors that need to be opened up for repair and that utilize dilution refrigerators have very long warm-up and cool-down cycles. These machines cannot go from temperatures of 15 millikelvin to room temperature and back again in a few minutes. That process typically takes several days, during which the computer is offline and unavailable.

But despite all of these challenges, there are arguments why some organizations might still want to have an on-premise quantum computer. Chief among them is data security. For those organizations that work with classified data or are developing a computational chemistry algorithm to discover a new proprietary drug or material, there is the concern that their calculations or data could leak outside their organization. These organizations could be worried that a technician at a cloud provider could intercept their quantum program and data while it is running on the cloud and use it for unauthorized purposes. Although this problem may eventually be solved by quantum homomorphic encryption, a technique that would allow a quantum computation to be performed while the data remains encrypted, research in this area is still very early and may not mature for a long time.

Another argument for having an on-premise computer is to bypass the job queues that can form on a shared quantum computer at a cloud provider. The cloud providers are also concerned with achieving a high utilization rate for these expensive machines and want to keep them as loaded as possible. However, this can sometimes result in waits of several hours to receive the results of a run after the job has been submitted. This can slow down any quantum software development effort, especially when a programmer needs to submit multiple runs in order to debug a program or try out different parameters to see which ones work best. Some quantum hardware suppliers provide the capability to reserve time on a machine so that a user can have sole use of it during that period, but that is still not the same as having your own machine that you can use any time you want.
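As a concrete illustration of this submit-and-wait workflow, the following sketch uses IBM’s Qiskit interface, one of the main public quantum cloud services at the time of writing. The backend name is illustrative, and the snippet assumes an IBM Q account has already been saved locally; queue behavior varies with provider and machine load.

```python
# Minimal sketch of the cloud submit-and-wait workflow using Qiskit's
# IBM Q provider (circa early 2020). The backend name is illustrative.
from qiskit import IBMQ, QuantumCircuit, execute

provider = IBMQ.load_account()                       # assumes credentials saved locally
backend = provider.get_backend('ibmq_16_melbourne')  # illustrative shared backend

# A trivial two-qubit Bell-state circuit, just to have something to run.
circuit = QuantumCircuit(2, 2)
circuit.h(0)
circuit.cx(0, 1)
circuit.measure([0, 1], [0, 1])

job = execute(circuit, backend, shots=1024)
print("Queue position:", job.queue_position())  # may be None once the job starts
result = job.result()                           # blocks until the job clears the queue and runs
print(result.get_counts())
```

The call to `job.result()` is where the wait happens: on a busy shared backend, the job can sit in the queue for hours before those results come back, while on a dedicated machine it would run immediately.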

So although the bulk of quantum computing access will be via the cloud in the near future, there will always be some demand for a machine on-premise. We expect this demand to come initially from government agencies that are working with classified data and are willing to pay whatever is needed to support an on-premise, secure machine.

But the question remains as to whether the demand for on-premise quantum computers will ever become more prevalent. We think it will, but it will take many years before it becomes common. Several breakthroughs will need to occur in the areas of cost, reliability, form factor, and user workload for this to happen.

One of the most costly components in a quantum computer is the dilution refrigerator, which costs over $500,000 and is required for all superconducting qubit implementations as well as many spin qubit implementations. It also takes up a significant amount of space. Many companies are developing alternative technologies, such as photonics and ion traps, that can eliminate this cost component and save space. Even a technology that can run at a temperature of 3 kelvin instead of 15 millikelvin can reduce cooling costs by over an order of magnitude, since it can be cooled with liquid helium or by methods other than a dilution refrigerator.
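The order-of-magnitude claim can be sanity-checked with an idealized Carnot calculation: the minimum work required to pump one watt of heat from a cold stage at temperature T_cold up to room temperature scales as (T_hot − T_cold)/T_cold. Real refrigerators operate far from this ideal limit, but the scaling shows why a 3 kelvin machine is so much cheaper to keep cold than a 15 millikelvin one. A minimal sketch:

```python
# Idealized Carnot comparison of the minimum work needed to remove one
# watt of heat at 15 mK versus 3 K. Real refrigerators fall far short of
# the Carnot limit, but the scaling illustrates the cost difference.

T_HOT = 300.0  # ambient temperature, kelvin

def carnot_watts_per_watt(t_cold):
    """Minimum input power per watt of heat lifted from t_cold to T_HOT."""
    return (T_HOT - t_cold) / t_cold

for label, t_cold in (("15 mK (dilution refrigerator)", 0.015),
                      ("3 K (liquid helium)", 3.0)):
    print(f"{label}: {carnot_watts_per_watt(t_cold):,.0f} W of work per W of cooling")

ratio = carnot_watts_per_watt(0.015) / carnot_watts_per_watt(3.0)
print(f"Ideal-limit ratio: about {ratio:,.0f}x")
```

In the ideal limit the ratio is roughly 200x, so even after accounting for real-world inefficiencies, a reduction of well over an order of magnitude in cooling cost is plausible.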

Even with the many quantum computing advancements expected over the next decade, we don’t expect that on-premise quantum computing will be the dominant mode in the foreseeable future. Although on-premise quantum computing will gain share over time, the cloud computing model is here to stay and will continue to be very popular in the classical computing world. And we have no reason to believe that quantum computing will be any different.

January 6, 2020