To Full-Stack or Not To Full-Stack; That is the Question

One of the buzzwords you will see in quantum computing is the term “Full-Stack”.  For those of you who are not familiar with this term, it refers to a company that intends to provide the complete quantum computing solution on its own, including the chip, the system, the software, and cloud access.

This approach is similar to the pre-1981 classical computing industry, where companies like IBM, CDC, Univac, Digital Equipment, and others each offered their own incompatible computer systems with their own operating systems, compilers, and utility programs.  These were often called “closed systems”.  After 1981, the classical computing industry changed dramatically with the advent of the Personal Computer and the formation of the renowned Wintel (Microsoft-Intel) alliance.  After that, companies offering personal computers used what has been called the “open system” approach.  They focused their offerings on specific pieces of the solution, such as hardware, applications software, or peripherals.  Systems from multiple manufacturers became standardized and compatible, prices dropped, shipment volumes increased, and the industry boomed.

Examples of companies that appear to be pursuing the full-stack quantum approach today include IBM, Rigetti, Microsoft, D-Wave, and perhaps a few others.  Meanwhile, companies like Silicon Quantum Computing, Intel, and many of the software companies are working to provide specific pieces of the solution and partnering with others to deliver the total experience to the end user.

There are arguments both pro and con for going full-stack.  To start with the pros, the most common reasons stated for going full-stack are that it improves the turn-around time for new development and that it allows a company to better optimize the hardware-software interface.  Companies will argue that by having full control of the wafer fab, system integration, and software, they can try different approaches and iterate much more quickly to improve their product.  Working with vendors or partners would slow them down, because external organizations may have other priorities and require more steps to communicate with.  Similarly, by having control of all the hardware, software, and system access, they can better optimize the complete solution for the best performance and functionality.  They can make tradeoffs in one subsystem and compensate for them in another.  For example, Rigetti is promoting a classical-quantum hybrid approach to solving problems, with features embedded in their hardware architecture to support it, as sketched below.  They are developing their own software to take advantage of this capability, and the support for it might not be as good if they relied on external software.
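To make the hybrid idea concrete, here is a minimal sketch of the kind of classical-quantum feedback loop such architectures are built to accelerate: a classical optimizer repeatedly adjusts a circuit parameter based on an expectation value measured on the quantum side.  This is only an illustration, not Rigetti’s actual API; the quantum step is simulated with plain NumPy, and the function names are our own.

import numpy as np

# Illustrative "quantum" step: expectation of Z after an RX(theta) rotation
# applied to |0>, which works out analytically to cos(theta).  In a real
# hybrid system this call would dispatch a circuit to the QPU.
def quantum_expectation(theta):
    state = np.array([np.cos(theta / 2), -1j * np.sin(theta / 2)])
    pauli_z = np.array([[1.0, 0.0], [0.0, -1.0]])
    return float(np.real(state.conj() @ pauli_z @ state))

# Classical outer loop: gradient descent, with the gradient estimated by the
# parameter-shift rule (two extra expectation-value evaluations per step).
theta = 0.3
for _ in range(50):
    grad = 0.5 * (quantum_expectation(theta + np.pi / 2)
                  - quantum_expectation(theta - np.pi / 2))
    theta -= 0.4 * grad

print(f"optimized theta = {theta:.3f}, energy = {quantum_expectation(theta):.3f}")
# Converges to theta near pi, where the expectation reaches its minimum of -1.

The round trip inside this loop, from the classical optimizer to the quantum processor and back, runs thousands of times per problem, which is exactly where control over both sides of the hardware-software interface pays off.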

However, there are also significant cons to the full-stack approach.  The most significant is that it requires a much larger amount of engineering resources to succeed.  Not only does this require a larger investment, it also poses a more significant management challenge to keep a much larger breadth of activity under control.  Companies will need to hire engineers and manage activities in areas as diverse as wafer fab, system mechanical engineering, electronic control systems, operating system software, applications software, cloud access software, etc.  Even a large company like Microsoft may face a challenge in some areas.  They will need a wafer fab to build their topological qubits, and Microsoft has never previously been in the fabrication business.  It will be interesting to see whether Microsoft decides to build their own fab or work with an external foundry to manufacture their chips.

Savvy players can sometimes take advantage of external resources to gain an edge.  In the early days of the personal computer, Compaq Computer achieved great success by cloning the IBM Personal Computer and offering a system that was both faster and cheaper than those offered by IBM.  They achieved this because they focused all their efforts on the hardware and did not have to worry about the software.  Not so great perhaps for IBM, but it was a huge success for Compaq as well as for users, who benefited from the increased competition.

So our expectation is that you will see both approaches for a while.  Quantum technology is still quite new, and trying to standardize an architecture and create an open system standard at this point may be premature.  An attempt to create a standard now may end up standardizing the wrong thing.  However, we are strong believers in partnerships and would encourage the quantum companies to establish them wherever they can.  No one company can do everything all the time, and the more different companies work together, the quicker we will see the technology develop and the faster the industry will grow.
