By David Shaw, Fact Based Insight
This article has been republished with permission. The original can be found on the Fact Based Insight web site here.
IBM continues to lead the quantum cloud, but faces real competition to service early quantum applications. Building the future of this market still looks like a long game. It may be more important to pick a partner with the right strategy than just to compare today’s product features.
The quantum revolution relies on much more than just building a new form of computing hardware. Innovative quantum algorithms are equally important, so making these easier to develop and deploy for real business applications will be a central concern. Finding the right framework to cast such maths into program form is also key, but the circuits of gate-model quantum computing and quantum annealing solvers aren’t the only games in town. Error correction is a key part of the quantum story and also needs to find its place. At the lowest level qubits require analogue control, itself a rich seam of challenge and opportunity. There is no layer in the quantum stack that doesn’t deserve our close attention. Fact Based Insight’s simplified view of the quantum stack is inspired by the activities of early players and lessons from the evolution of the modern HPC stack.
Applications – End-use business-oriented applications. For the most part these remain a work in progress. Many players have emphasised early research and user community engagement to develop proof-of-concept and trial applications.
Algorithms – Unique quantum approaches to solve various classes of problem. For a detailed discussion see Quantum Algorithms Outlook 2022.
Framework – Most early players emphasise circuit-model quantum computation (though there are important variations on this approach). Various providers offer their own way of describing and executing the required quantum circuits.
Architecture – Runtime environment co-ordinating compute operations. Includes quantum gates, measurements and tightly coupled classical logic. In the end we can expect kernels optimised for the efficient implementation of quantum error correction, and that co-ordinate access to specialised resources such as magic state factories and QRAM.
Control – Low-level operations driven by analogue pulses (typically microwave or laser based). Pulse shape and timing are crucial. Advanced protocols are required to optimise gate operations and to suppress crosstalk.
Quantum – The actual qubit hardware. For a discussion focussed on the quantum layer see Quantum Hardware Outlook 2022.
Simulator – Conventional simulators are a key additional element of the software stack, not just because of the limited performance of current quantum processors, but also to support ongoing programme development and debugging.
Note: by design the diagrams supporting this briefing represent many different approaches against one simplified view of the stack. They represent the status of which Fact Based Insight is aware and we caution readers to take them only as a starting point for their own further evaluations.
The development of the quantum software sector is proceeding rapidly. However, Fact Based Insight’s reviews of progress on quantum hardware and quantum algorithms indicate that this is still likely to be a long game. For early adopters and would-be quantum developers, the most important consideration may not be immediately available software product features, but picking the right partners. To understand the current state of play, investors have to appreciate the challenges and opportunities faced by the early players, and the different longer-term commercial strategies towards which they are evolving.
IBM has been the clear early victor in quantum software’s ‘initial awareness’ phase. It was others who first put quantum processors on the cloud (Jeremy O’Brien at the Univ. of Bristol in 2013), but it was the launch of the IBM Quantum Experience in 2016 that really succeeded in driving a step change in engagement. Now renamed IBM Quantum, it has over 360,000 registered users, with an average of 2.2B circuit shots executed per day in 2021 and typically 25 quantum processors available online at any one time.
IBM Quantum Experience initially focussed on building awareness by offering a simple graphical web interface where users could create (compose) simple quantum programs (circuits) and then run them on early quantum hardware. IBM built on this success to introduce Qiskit, an open-source programming framework suited to scientific and early industry adopter use. Educational resources and events aligned with the programme have been a key emphasis. Its OpenQASM low-level circuit representation language has emerged as a de-facto industry standard. An OpenQASM 3.0 draft spec was launched in 2021 and continues to attract cross-industry interest and support. Proof of concept integrations with AQT and most recently with IonQ demonstrate Qiskit’s applicability to multiple qubit hardware types. Today the IBM Quantum network partners include commercial majors Daimler, ExxonMobil, JP Morgan Chase & Co, Samsung, Goldman Sachs, Accenture, Boeing and LG Electronics, with 130+ members overall. On-premise installations and strategic partnerships include Fraunhofer (Germany), Univ. of Tokyo (Japan), Cleveland Clinic (Healthcare) and Yonsei Univ. (South Korea). This sits neatly alongside IBM’s overall business technology and services offerings.
To really appreciate the strategic aspect of IBM’s thinking, recall that in a narrow sense this global hardware access was not really required. Broadly speaking we can readily simulate quantum processors up to about 40Q. IBM correctly realised that educational and wider ecosystem engagement required the excitement of access to real devices. It also realised that helping people learn their quantum basics would build powerful business momentum.
D-Wave has also had significant success with its own early phase strategy. First to launch a commercially available quantum computer in 2011, D-Wave initially suffered a wave of controversy in the sector’s first hype vs realism war. The debate continues today over its distinctive hardware strategy, though now all sides broadly accept the constraints and possibilities. This notwithstanding, Fact Based Insight believes that the success of D-Wave’s distinctive, user-led trials approach deserves attention. While getting applications into routine production use has proved elusive, Fact Based Insight finds it very easy to believe that clients have seen business value. Businesses need ways to motivate their best and brightest to think about their business problems in new ways. Benefits flow back to the business by multiple means. D-Wave continues to innovate in the battle to keep clients on its platform for the long term.
D-Wave Leap cloud and the Ocean development environment provide a platform for business-oriented quantum annealing applications. The suite of tools has grown to include hybrid quantum annealing/classical solvers. The most recent developments to the service emphasise the ease of use of its algorithmic tools and its interface for job tracking and visualisation. The focus on simplicity and robustness is explicitly designed to support production applications. While sticking with quantum annealing for optimisation problems, D-Wave has also announced plans to build gate-model quantum computers to tackle materials science and quantum chemistry simulation problems.
Some in the gate-model community may be tempted to interpret delay in getting applications into production as somehow reflecting a weakness specifically in quantum annealing. Fact Based Insight thinks this misses the true lesson. In the absence of error correction and large-scale machines, early gate-model successes will also likely be incremental rather than revolutionary. D-Wave’s experience is actually a reminder of just how difficult it is to implement change in any large client organisation. The professional services sector has long understood this and increasingly stands ready to help. The importance of its future role should not be underestimated.
Early gate-model full-stack players
For many early hardware players in gate-model quantum computing, creating their own full software stack was simply a matter of necessity. However, ‘full-stack’ doesn’t imply an evolved service offering to the extent that IBM has been able to build.
Google – Following its demonstration of ‘quantum supremacy’ in 2019, Google has crystallised its stack around the Cirq framework. A notable addition in 2021 is Stim, a stabilizer circuit simulator unique in focussing on support for quantum error correction research.
Rigetti – Pioneered quantum circuit execution in tight classical loops. This VQA-friendly technique has only much more recently found its way into IBM Runtimes. Conversely, Rigetti have now followed IBM’s lead by introducing Quil-T to open up control of their systems down to the pulse level. An extension of Quil-T allows Rigetti devices to be manipulated as 3-level systems, qutrits, rather than just standard qubits.
Xanadu – The Strawberry Fields framework supports Xanadu’s distinctive photonic approach to quantum computing. However, their influential algorithm library, PennyLane, is compatible with a wide variety of other gate-model setups.
OriginQ – Pioneers of quantum computing in China. Their stack is already proven across both their own superconducting and silicon qubit processors.
Hardware players coming slightly later to the party can now take advantage of an ecosystem that allows them to avoid having to build the upper layers of the software stack. We can still expect further innovation where players look to unlock particular opportunities associated with their qubit platforms.
Pasqal – Introduced Pulser to support pulse level control of its devices. This is doubly relevant for neutral atom devices due to their potential application in analogue quantum simulation.
Taking time to publicly expose, document and support lower layers of your stack imposes its own costs. Such granular features may bring additional innovation and performance to a provider’s local ecosystem. But how many end-users will directly access them? Strategists must weigh the costs and benefits on a case-by-case basis.
Quantum Platform as a Service
No one truly knows what quantum hardware strategy will win out. Early adopters are therefore typically seeking a platform that promises to be, to the maximum degree possible, hardware agnostic. This fits well with the concept of PaaS offerings. Leading conventional cloud computing majors, AWS and Microsoft Azure, have joined IBM to contest the quantum cloud market.
IBM Quantum has always emphasised the access it provides to its own quantum backends. (Qiskit supports access to other provider backends, however these are not currently fully integrated in the true PaaS offering). IBM’s recent addition of VQA-friendly Qiskit Runtime containers across its processor fleet (to include dynamic mid-circuit measurement & feed-forward from 2022) is the first step in its plans to build a true ‘serverless’ cloud offering. This requires not just classical resources coupled tightly to the kernel, but also resources to which substantive parts of a calculation can be offloaded. In 2021 IBM completed proof of concept demonstrations for circuit knitting and circuit embedding (techniques that reduce the required quantum resources by offloading parts of the calculation to classical resources) and demonstrated how the IBM Cloud Code Engine will be able to combine these. The introduction of this Quantum Serverless model is a key part of IBM’s drive to be ready to support deployment of true applications, with quantum advantage, from 2023.
A user comments “IBM has spent a lot of time and money perfecting the end-user web-based front-end which provides sophisticated job management, queueing and processing features, data storage, research groups, resource authentication and authorization, all within a service that handles a hundred thousand users. None of the other vendors come close to the level of sophistication provided here.” IBM seems well positioned to capitalise on its ability to reach a large number of users as hardware matures. Other vendors will need to work hard to perfect their service infrastructure to this degree.
Amazon Braket has started to provide a challenge in this market by offering access to D-Wave, IonQ and Rigetti devices, with additions from OQC and QuEra set to follow soon. Support for PennyLane is another notable feature. A variety of powerful simulators are also available. AWS has balanced its relatively late entry onto the quantum stage with a ‘we’re here to stay’ message. It has backed this up with impressive expert recruitment and investments in the AWS Center for Quantum Computing at Caltech and the Amazon Quantum Solutions Lab. A user comments “The Braket framework and circuit API is less comprehensive than what’s offered by Qiskit, limiting to some degree the sophistication of applications that can currently be run.” Amazon will need to remedy this to be successful. However, almost everyone agrees that quantum computing will always really be a hybrid form of computing. Braket is well placed to exploit the corporate-friendly provisioning flexibility underpinned by AWS’s leadership in the wider cloud market.
Microsoft Azure Quantum is also exploiting its parent’s competitive position in the wider cloud sector. Its Azure Quantum platform is currently in public preview. Microsoft has developed Q# as a dedicated language for authoring quantum algorithms, together with associated library and training resources. The recent addition of support for code authored in the Qiskit and Cirq frameworks is acknowledgement of the influence of these approaches in the wider ecosystem. Microsoft has a long track record in quantum research via its Station Q network. Perhaps more than any other major player, Microsoft has tended to view true quantum advantage as a long game. It’s perhaps therefore not a surprise to see a high profile for quantum-inspired digital annealing solutions in Azure Quantum. The Amazon Braket and Azure Quantum offerings are differentiated from IBM’s in that they are providing high-level job management and service tools to complement a growing range of hardware vendors.
Google has a leading position in the quantum sector and an honourable third place in the conventional cloud computing sector. However, it hasn’t yet moved its Quantum Computing Service beyond the ‘early access programme’ stage. This reflects the time it has taken to move its device operations from ‘one-off science experiments’ to a regular production service. It has also not wanted to launch a service based only on processors that can still be simulated classically.
It’s worth recalling that no cloud-accessible processor has yet demonstrated a ‘beyond classical’ calculation.
Conventional simulators remain a valid parallel route for quantum research and development. Simulation is typically possible up to about 40Q (more if we can live with simplifications). Advanced simulators can even build in realistic noise models. Simulators are anyway likely to play a key ongoing role in debugging and validating quantum applications.
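To make the scaling concrete, here is a minimal statevector simulation sketch in NumPy (our own illustration, not any vendor’s simulator): it prepares a two-qubit Bell state, then shows why memory, doubling with every added qubit, caps this approach at roughly 40Q.

```python
import numpy as np

# Minimal statevector simulation: prepare a Bell state on two qubits
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard gate
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                   # control q0, target q1

state = np.zeros(4)
state[0] = 1.0                                    # start in |00>
state = np.kron(H, I2) @ state                    # Hadamard on qubit 0
state = CNOT @ state                              # entangle the pair
print(np.abs(state) ** 2)                         # ~[0.5, 0, 0, 0.5]

# Why ~40Q is the practical ceiling: a full statevector holds 2**n
# complex amplitudes at 16 bytes each, doubling with every qubit.
for n in (30, 40):
    print(n, "qubits:", 16 * 2 ** n / 2 ** 40, "TiB")
```

At 30 qubits the vector fits in a well-equipped laptop (16 GiB); at 40 qubits it already demands ~16 TiB, i.e. HPC-class memory, which is where the briefing’s 40Q rule of thumb comes from.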
QLM – Atos has established a complete software stack on top of its dedicated quantum simulator hardware, the Quantum Learning Machine. This is being expanded to also target the execution of quantum-inspired algorithms for digital annealing applications.
Academic projects – Several simulators with academic roots, such as ProjectQ and QuEST, offer their own route to quantum algorithm experimentation. Simulators are also a key part of the offering in the Quantum Inspire platform.
NVIDIA – Has launched cuQuantum to allow fast quantum simulation on its GPU-based hardware. Now in public beta, adoption should become easier later in 2022 when a container including cuQuantum and Google’s Cirq framework is due to be made available for NVIDIA DGX hardware.
Two notable open-source simulators, Tai-Zhang from Alibaba and HiQsimulator from Huawei, didn’t see further development in 2021. However, Huawei has been active in support of ProjectQ.
Quantum software startups
Perhaps one of the greatest surprises of the original digital revolution was the importance of continuous innovation driven by software startups. In the current quantum revolution, it’s no surprise that we’ve seen a wave of quantum software startups around the globe. A key question is how they will sustain themselves until the broad commercial use of quantum computing becomes embedded and a more conventional market develops. A crucial question for management and investors is how many years the business might have to run before the promised land is realised. Quantum software startups have been pursuing a variety of strategies.
Finding long-term backing
One strategy is to be acquired/merge with another player with access to the funding required to support the journey for at least the medium term. For example, the scale of funding required to bring quantum hardware to fruition can make software activities look like a bargain. Funding overflowing from red-hot end-use sectors is creating alternative opportunities.
QxBranch were perhaps the first to take this route via their 2019 acquisition by Rigetti. Its connections have helped Rigetti extend its footprint outside of the US. However, the applications story did not feature prominently in Rigetti’s 2021 SPAC-assisted listing.
Cambridge Quantum’s merger with Honeywell’s spun-out Quantum Solutions division is another deal that brings together hardware and software. However, here the strategy seems clearly set on maintaining the momentum of the hardware-agnostic software programme.
Rahko was recently acquired by Odyssey Therapeutics, itself a recent precision medicine startup with a focus on cutting-edge drug discovery methods. The wall of money currently available in biotech is giving such players the ability to pick the tools they want to develop as part of their own long-term journey. Odyssey has effectively in-housed a significant quantum algorithms capability.
Qu&Co and Pasqal have also announced their merger. The fit is natural. Qu&Co have been notably sceptical of how soon existing NISQ approaches, such as VQE, can deliver the accuracy required for real quantum chemistry applications. Equally, neutral atom hardware such as that from Pasqal opens up the possibility of interesting alternatives such as analogue quantum simulation. Pasqal will welcome top software talent looking at this relatively unexplored field.
Other software startups have different views on their funding and growth journeys.
Algorithms to applications
A challenge for many players is how to build-out deep and highly specialised quantum algorithms expertise into capabilities that can be applied against a wide variety of potential real-world applications. A significant obstacle is that business insight is industry area specific, sometimes even geography specific, and often requires a lifetime’s study in its own right!
A key strategy is to combine algorithms smarts with enabling software and to engage with potential end-users on the challenge of identifying and building-out key application use-cases. The objective is to recycle learnings into services that can ultimately be productised. National quantum programmes are often keen to help. The challenge is how to avoid being dragged into a conventional consulting business model: scarce quantum algorithms expertise doesn’t scale easily, and for all the fun of conventional consulting, it doesn’t offer the same upside returns quantum VC backers are typically seeking. A common approach is to offer quantum opportunity assessment and pilot projects. The ideal is to see these develop into big-ticket retainer relationships with high-profile clients. Clients benefit from being seen to be active in an important coming area and from a kick-start to their own R&D activities.
QC Ware have an active algorithms research programme and leverage their Forge platform to both support efficient project execution, and the packaging of know-how into future service products. Notable features include components with broad applicability, such as data loaders and low circuit depth amplitude estimation. They scored a success in 2021 winning a BMW QC Challenge based on their hybrid optimisation approach. Most would agree their long-term promotion of the industry-leading Q2B event has proved a far-sighted success.
Zapata is also noted for its algorithms research. Its Orquestra platform targets the workflow and data management issues that look likely to be common challenges in taking applications into deployment. Orquestra provides both authoring and operations support, and enables both public and private cloud solutions. Clients include top-5 companies in the chemical, energy, and food & drink sectors. Fact Based Insight sees this tool as particularly strongly placed to help clients turn the expertise of individuals into organisational learning.
1Qbit has been an early mover in the quantum services sector, which it has built out by leveraging its 1QCloud optimisation platform. It led the way in emphasising quantum-inspired solutions as part of its offering to clients (something now copied by others). It scored a notable success in 2021 winning a BMW QC Challenge based on its work in optimisation.
Quantinuum (formerly Cambridge Quantum) have used the leading performance of its TKET compiler, and the hardware-agnostic capabilities it delivers, to build its presence across potential quantum application domains. It’s using this momentum to directly target long-term collaborative relationships with major business and institutional partners.
Software-as-a-Service – Quantinuum stands out as the provider with the first true quantum SaaS offering: the Quantum Origin cybersecurity key generation solution.
We can expect this service to grow to address other cybersecurity application areas. The extent to which a service like this can generate consistent revenue growth will be a much-watched totem for the wider market.
A variation of this strategy is to focus more specifically on an application domain area of expertise. Such a strategy has the upside of honing in-house, industry-specific application/algorithm skills that could be a key future differentiator. This is a well-trodden path in the conventional software world. This may also provide an easier route to early revenue opportunities from quantum-inspired applications.
Multiverse have focussed on financial services. Their Singularity toolbox emphasises the ability to run in a security-conscious banking environment, and to offer both quantum-inspired and quantum solutions. It’s a sign of client workflow awareness that it offers both Python and Excel frontend integration. Multiverse has been at the forefront of proof-of-concept application demonstrations in finance.
Qu&Co have begun their development with a notable focus on quantum chemistry. Their QUBEC software platform is now in beta. A striking feature is integration with the Maestro chemical modelling interface of Schrödinger’s leading conventional quantum chemistry software package. Qu&Co also have the ability to widen this base, recently winning a BMW QC Challenge based on their pioneering quantum algorithms work on partial differential equations.
Phasecraft have focussed on materials science as the area where quantum advantage can be delivered most quickly. Their emphasis is on nitty-gritty smarts (to the pulse level where necessary) to get algorithms to run at useful scale on NISQ devices. This takes its inspiration from the assembly language contortions required in the earliest days of conventional computing. It’s not that the team doesn’t have the capability to tackle other problems. It’s just that this is what they view as realistic to tackle first. They are benefiting from funding from UKRI on a project for battery materials design.
Many players are building specific experience into algorithm libraries. To drive adoption, these are typically being developed on an open source basis, however they still promise to be a way of building significant value-chain influence for their sponsors. In Fact Based Insight’s opinion, two stand out as examples of how activity in this area can be used to complement a company’s long term strategy.
PennyLane from Xanadu is a library offering whose influence has grown well beyond the confines of its parent’s own hardware. It was initially conceived as a tool for quantum machine learning, offering a NumPy interface familiar to the machine learning community. This format has proven adaptable to support a wide variety of VQAs, and so PennyLane has also found application in other potential NISQ application areas such as quantum chemistry. This has been a great way to counter any threat that its parent’s naturally distinctive software stack might become ‘detached’ from the wider community.
Lambeq from Quantinuum seeks to open up a new ecosystem of quantum developers targeting QNLP applications; an area where quantum computing has intriguing and still relatively unexplored potential. This is an area where the Quantinuum team literally wrote the book. But why dominate a small pond when you can work with others on a hydroelectric scheme?
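The VQA pattern that PennyLane popularised (a parameterised circuit evaluated on hardware, with a classical optimiser in the loop) can be sketched without any framework at all. Below is our own minimal single-qubit illustration in plain NumPy, using the parameter-shift rule for gradients; the function names are ours and no PennyLane API is used.

```python
import numpy as np

# <Z> after RY(theta)|0>, computed by statevector simulation.
# (On real hardware this would be estimated from measurement shots.)
def expval_z(theta):
    ry = np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                   [np.sin(theta / 2),  np.cos(theta / 2)]])
    state = ry @ np.array([1.0, 0.0])          # RY(theta)|0>
    z = np.array([[1, 0], [0, -1]])
    return float(state @ z @ state)            # <psi|Z|psi>

# Parameter-shift rule: an exact gradient from two extra circuit runs
def grad(theta):
    return 0.5 * (expval_z(theta + np.pi / 2) - expval_z(theta - np.pi / 2))

theta = 0.1
for _ in range(100):                           # classical optimiser loop
    theta -= 0.4 * grad(theta)                 # descend toward <Z> = -1
print(round(expval_z(theta), 3))               # -1.0
```

The same loop structure scales to the multi-qubit ansätze used in quantum chemistry and optimisation; what changes is only the circuit inside `expval_z`.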
Making change happen
More broadly based consultancies are also set to play an important part in this market. Both technology specialists such as Reply (notable for winning the Airbus 2020 QC challenge) and more broadly based players such as Accenture (also a BMW QC challenge winner), Boston Consulting Group, Deloitte and McKinsey have all been notably active in building their quantum practices. Mainstream consultants have long experience in converting new ideas into business change. They also have the industry skills and in-country resources required to drive practical projects. It’s pertinent to recall that IBM is also a strong player in business services. As the focus of value creation builds in this segment, it is moving towards an area of IBM strength versus other tech majors. IBM has launched Quantum Accelerator to start to exploit this.
Better quantum tools
A complementary strategy is to provide the tools that quantum software developers themselves want to use, in the short, medium and long term. This type of pick-and-shovel play is also something that investors understand. However, the software community has a strong preference for open source tools. The challenge is how to design a business model that works with this.
One option is simply to seek to provide a better platform experience than that available elsewhere. This combines well with the opportunity to cultivate interest from quantum newcomers (and the future commercial ventures that can be seeded there), and with larger players who are really serious about avoiding lock-in with any single cloud vendor. Strangeworks has been pioneering this strategy for development platforms, but we are also seeing activity elsewhere in the stack.
Strangeworks QC (Quantum Computing) offers a hardware-agnostic development frontend that is, in Fact Based Insight’s opinion, the easiest for a newcomer to get up and running on. Crucially it also offers access to cutting-edge tools such as the TKET compiler and IBM Qiskit Runtimes. The community library feature is a good learning tool (across a number of frameworks) and will appeal to those who understand the value that this mindset has brought to conventional software.
Strangeworks EQ (Enterprise Quantum) adds access to a wide (and growing) variety of quantum backends. The recently announced integration of Quantinuum’s Quantum Origin service is a natural fit here.
Qapitan is a much more recent startup with interesting plans to build a quantum API marketplace. Currently in private beta, this promises an easy route for developers to deliver and commercialise SaaS offerings, while allowing end users to benchmark alternatives and upgrade as the market develops.
A medium-term challenge for businesses of this type is how to avoid clients moving elsewhere and disintermediating the service once they better understand their needs and the market matures. These platforms will need to focus on the hooks of real sustainable value they create.
Scaling-up algorithm authoring
Some companies are targeting tools that tackle the quantum algorithm design challenge. There are only three underlying quantum primitives known to give a quantum speedup. However, these can be combined into algorithms to address a variety of problem classes. These need to be further adapted for use in an even wider variety of business applications. To address problems of practical interest, these algorithms ultimately need to be implemented at the scale of many, many qubits. Most experts envisage quantum advantage requiring at least hundreds of qubits (either physical qubits with much higher fidelity than we enjoy today, or logical qubits after the application of quantum error correction).
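Amplitude amplification, the primitive behind Grover-style search speedups, is easy to see at toy scale. A minimal NumPy sketch of ours: in a 4-element search space, a single oracle call plus one ‘inversion about the mean’ concentrates all probability on the marked item.

```python
import numpy as np

N = 4                                   # 2-qubit search space
marked = 3                              # index of the "solution" state |11>

# Uniform superposition (the result of a Hadamard on every qubit)
psi = np.full(N, 1 / np.sqrt(N))

# Oracle: flip the phase of the marked state only
oracle = np.eye(N)
oracle[marked, marked] = -1

# Diffusion operator: inversion about the mean amplitude
s = np.full((N, 1), 1 / np.sqrt(N))
diffusion = 2 * (s @ s.T) - np.eye(N)

psi = diffusion @ (oracle @ psi)        # one Grover iteration
print(np.abs(psi) ** 2)                 # all probability on the marked state
```

For larger search spaces the pattern is the same, repeated roughly √N times; the design work the text describes is in adapting such primitives to business problem structure.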
Classiq seeks to get ahead of the issue of designing quantum circuits for devices with larger numbers of qubits. Just as we don’t programme conventional devices at the gate level, Classiq simplifies the process by implementing a re-usable, modular block structure. The smart part is that it automates the process of optimising multiple individual blocks versus system-wide constraints, and allows the programmer to steer key trade-offs (such as overall qubit count or circuit depth). Output code is compatible with all major platforms. This may seem like a conceptual overhead when working with today’s small devices, but something like this will likely be essential in the future. Classiq believes its approach will make complex circuits easier to debug and maintain. This could be a key proof point.
Horizon is pursuing a particularly audacious vision. It seeks to allow users simply to code in high-level classical languages and then benefit from conventional or quantum execution without any special knowledge of quantum computing. A key insight is that opportunities for speedup come not just from abstract problem classes, but also from commonplace programme structures such as loops and array manipulations. Overall, Horizon envisages a chain of compilation that can be unpacked at multiple levels of detail. Many quantum algorithms face the challenge of efficient implementation of common functions from addition/subtraction to exponentiation. Horizon can already show proof-of-concept improvements in this area. Fact Based Insight thinks Horizon’s vision is the destination most users would like the community to get to. However, the full realisation of its promise will require tech such as FTQC, QRAM and faster quantum architectures to be delivered.
A short-term challenge for advanced authoring solutions is the limited capabilities of today’s quantum hardware. Getting ahead of the development game and exploring future resource requirements is a valid strategy, but client expectations on the timeline to real quantum execution will have to be managed.
Optimised low-level compilation
Lower in the stack, optimising quantum compilers have to deal with a series of additional, uniquely quantum challenges: native gate set mapping, qubit placement and routing, circuit optimisation, and error mitigation. Technically we’re often actually talking about a transpiling operation, as multiple parts of the compiler chain each play their part. Three independent low-level quantum compilers stand out from the crowd. Each showcases unique aspects of the know-how that will be necessary to succeed in this market.
TKET – This flagship NISQ compiler is able to transpile efficiently between a uniquely wide range of quantum frameworks, allowing it to provide unmatched hardware agnosticism. It backs this up with effective heuristic approaches for qubit placement and routing. There is a strong mathematical dimension to making a successful quantum optimising compiler; in TKET’s case this leverages concepts from the ZX-calculus formulation of quantum mechanics. More recently the addition of the Qermit module has streamlined support for common error mitigation protocols. TKET has been used in a wide variety of cutting-edge research work. TKET is now open source, emphasising Quantinuum’s ambition to build its role at the heart of the quantum ecosystem.
True-Q – Quantum Benchmark has a strong heritage in the characterisation and mitigation of quantum errors. True-Q features the randomized compiling technique it originally developed to overcome systematic control errors. This has now been augmented by the cycle benchmarking technique that targets fidelity in multi-qubit processes. True-Q scored a number of impressive citations in academic work in 2021 for error suppression and/or error diagnostics (including from Google Quantum AI, LBNL’s Advanced Quantum Testbed, NASA, ORNL and NCSU, across frameworks including Qiskit, Cirq and Quil, and with AQT’s trapped ion devices). Quantum Benchmark was acquired in 2021 by Keysight Technologies. As we shall see, this is part of a dynamic that promises to push a new range of possibilities back up the quantum stack.
Fire Opal – Due to launch to end-users early in 2022, this is a new addition right at the base of the low-level compiler chain. It has shown impressive initial results. Q-Ctrl has a track record in ‘robust’ qubit control protocols at the pulse level. It has designed Fire Opal as a physical gate and pulse-level compiler able to complement higher-level optimising compilers, such as TKET.
Over the last year the market has moved to Q-Ctrl's advantage, with Rigetti and Quantum Machines joining IBM in opening up their hardware to third party pulse level control. Q-Ctrl offers integrations with each of these players. It is also now active in seeking to exploit its technology in quantum sensing applications.
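To make one of these error-suppression techniques concrete: randomized compiling (the approach behind True-Q) wraps each noisy two-qubit gate in random Pauli gates, with compensating Paulis computed so the ideal circuit is unchanged, turning structured coherent errors into better-behaved stochastic noise. The toy Python sketch below is purely illustrative (all names are hypothetical, and it is not True-Q's implementation); it tracks how a CNOT conjugates Paulis, up to a physically irrelevant global phase.

```python
import random

PAULIS = "IXYZ"

def mul(a, b):
    """Pauli product, up to global phase (signs are dropped in this sketch)."""
    if a == "I": return b
    if b == "I": return a
    if a == b:   return "I"
    return ({"X", "Y", "Z"} - {a, b}).pop()

# How a CNOT conjugates single-qubit X and Z factors on (control, target),
# phases dropped: e.g. X on the control propagates to X on both qubits.
_X = {"c": ("X", "X"), "t": ("I", "X")}
_Z = {"c": ("Z", "I"), "t": ("Z", "Z")}

def conjugate_cx(pc, pt):
    """Propagate the Pauli pair (pc, pt) through a CNOT, up to phase.
    Each Pauli is split into its X/Z factors, which are pushed through
    separately and recombined (conjugation is multiplicative)."""
    out = ["I", "I"]
    for pauli, wire in ((pc, "c"), (pt, "t")):
        if pauli in ("X", "Y"):
            out = [mul(o, n) for o, n in zip(out, _X[wire])]
        if pauli in ("Z", "Y"):
            out = [mul(o, n) for o, n in zip(out, _Z[wire])]
    return tuple(out)

def twirl_cx():
    """One randomly compiled CNOT: random Paulis before, the conjugated
    Paulis after, so the overall circuit is unchanged (up to global phase)."""
    pre = (random.choice(PAULIS), random.choice(PAULIS))
    post = conjugate_cx(*pre)
    return pre, post
```

Averaged over many such random dressings, coherent gate errors are 'twirled' into an effective stochastic Pauli channel, which is both less damaging and easier to characterise.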
As this segment continues to advance, the ability to transparently compare compiler performance will be increasingly important. Fact Based Insight would like to see more work referencing defined metrics, such as compiler influence on QV and CLOPS, and across the QED-C benchmark suite. Academic developments also remain very relevant in the compiler segment. An important theme has been how to leverage the conventional computer science techniques of formal methods. These seek a mathematically rigorous approach to verifying a program’s correctness. In the highly mathematical, but difficult-to-debug, world of quantum circuit transpilation these techniques may be even more relevant than in classical programming.
VOQC stands out not just for achieving performance comparable to leading compilers such as Qiskit or TKET, but in particular because its circuit optimisations are proved correct in the Coq proof assistant. PyZX is a notable research transpiler built squarely on the ZX-calculus.
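The qubit placement and routing challenge these compilers tackle can be illustrated with a deliberately naive pure-Python pass (all names hypothetical): make every CNOT executable on a linear nearest-neighbour coupling map by inserting SWAPs, while tracking where each logical qubit has migrated. Production compilers use far smarter heuristics precisely to minimise this SWAP overhead.

```python
def route(circuit, n_qubits):
    """Insert SWAPs so every CNOT acts on adjacent qubits of a linear chain.

    circuit: list of ("CX", control, target) gates on logical qubits.
    Returns the routed gate list on physical sites, plus the final layout
    (phys[i] = which logical qubit ends up at physical site i).
    """
    phys = list(range(n_qubits))            # site -> logical qubit
    site = {q: q for q in range(n_qubits)}  # logical qubit -> site
    routed = []
    for gate, c, t in circuit:
        # Naive strategy: walk the control towards the target one SWAP at a time.
        while abs(site[c] - site[t]) > 1:
            s = site[c] + (1 if site[t] > site[c] else -1)
            routed.append(("SWAP", site[c], s))
            other = phys[s]                 # logical qubit being swapped past
            phys[site[c]], phys[s] = other, c
            site[other], site[c] = site[c], s
        routed.append((gate, site[c], site[t]))
    return routed, phys

# A CNOT between logical qubits 0 and 3 on a 4-qubit chain costs two SWAPs.
routed, final_layout = route([("CX", 0, 3)], 4)
```

Every inserted SWAP is itself several noisy native gates, which is why good placement and routing heuristics translate directly into circuit fidelity on NISQ hardware.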
Building-out low-level control
In conventional computing, low-level control of gate operations has long since migrated to the world of microarchitecture and firmware. However, key challenges remain at this layer of the quantum stack. Qubits are fundamentally analogue systems, compounding the challenge of noise and crosstalk, but also opening up possibilities for smart optimal control and calibration techniques. Quantum error correction is likely to need tightly synchronised readout and feedback across extended code patches. Some of the players here have a core base in specialist control electronics, valuable in its own right. However, we need to understand that the ambitions of many of these players don’t rest just at the bottom of the stack. If achieving NISQ quantum advantage relies on highly optimised low-level performance, and particularly in a world where this pushes application-specific quantum computers to the fore, such players can expect to shine. This is likely to create an opportunity for them to capture more value.
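To see why pulse shape matters at this layer: the widely used DRAG scheme adds a scaled derivative of the Gaussian drive envelope on the quadrature channel to suppress leakage into higher levels of weakly anharmonic qubits. A minimal sketch follows (illustrative parameter values and function names; not any vendor's API):

```python
import math

def drag_pulse(n_samples, sigma, beta, amp=1.0):
    """Gaussian in-phase (I) envelope plus a derivative quadrature (Q)
    component, as in the standard DRAG protocol. beta scales the
    derivative term; its optimal value depends on the qubit's anharmonicity."""
    t0 = (n_samples - 1) / 2.0
    i_env, q_env = [], []
    for k in range(n_samples):
        g = amp * math.exp(-((k - t0) ** 2) / (2 * sigma ** 2))
        dg = -(k - t0) / sigma ** 2 * g   # analytic derivative of the Gaussian
        i_env.append(g)
        q_env.append(beta * dg)
    return i_env, q_env

# 41-sample pulse: symmetric Gaussian on I, antisymmetric derivative on Q.
i_env, q_env = drag_pulse(n_samples=41, sigma=8.0, beta=0.3)
```

Calibrating parameters like beta per qubit, and keeping them calibrated as the device drifts, is exactly the kind of automated optimisation work this layer of the stack exists to do.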
Quantum Machines has a strong global presence in providing control systems for commercial and academic quantum computing efforts. The company’s Quantum Orchestration Platform (QOP) is designed for scale and provides important capabilities such as dynamic mid-circuit measurement & feed-forward. The QOP tightly integrates classical and quantum processing with the QUA pulse-level language and the custom designed Pulse Processor, which brings classical processing all the way down to the real-time control hardware. This is designed to support a wide variety of quantum use cases across qubit platform types. Quantum Machines believes that this architecture lays the foundation for a full-scale architecture for heterogeneous quantum computing in HPC and cloud infrastructures.

Zurich Instruments is an established scientific instruments provider. Its core strength is built on its hardware performance (particularly lock-in amplifiers). However, as an early quantum mover, it has been quick to build out a full, purpose-built QC control stack. This already supports multiple qubit types and low-latency measurement feedback. Zurich Instruments was recently acquired by Rohde & Schwarz, the electronic equipment group. Its strong presence in quantum control is a key part of this move and can be expected to be further accelerated by it.

Qblox is a spin-out from the impressive QuTech ecosystem (and so benefits from direct experience with multiple qubit hardware types). It currently serves 25 academic and industrial labs. It provides a fully integrated, modular control and readout solution with a focus on scalability. Success in the public tender for a 20Q solution for Chalmers Univ. was a notable win. Qblox’s core strength is the stability and time synchronisation of its hardware (SYNQ protocol) plus low-latency feedback/control flow (LINQ protocol). Its presence in the software stack is growing via its collaboration with OrangeQS, another QuTech spin-out: together they maintain the open source Quantify automated calibration and characterisation software.

Keysight Technologies recently acquired Quantum Benchmark. This continues to strengthen its existing quantum-product suite, including hardware from Signadyne and software from Labber Quantum. Fact Based Insight expects these capabilities to be joined up into a powerful offering at the base of the quantum stack.
Software startups are also being attracted to this segment.
Riverlane are developing Deltaflow.OS, supported by grant funding from UKRI. This aims to provide qubit hardware developers with a fast and scalable solution for key functions: low-latency & scalable control; automatic calibration & tuning; co-ordination of runtime tasks between quantum and classical resources; error correction decoding. This leverages a distributed rather than hierarchical network of nodes. A pilot integration has been demonstrated with Artiq (a popular trapped ion control system). The QHAL hardware abstraction layer already promises compatibility with six flavours of hardware across four qubit technologies. AutoQT, another UKRI funded project, is bringing leading insights from the world of machine learning to this effort. In Fact Based Insight’s view, a crucial part of the appeal of Deltaflow.OS is that it doesn’t bake in one model of how the quantum stack should work. It can support, but does not require, the conventional circuit model. It promises to focus the best available resources on the difficult decoding problem, but it leaves open exactly how this should be integrated. In a world where this R&D flexibility is needed to get quantum over the line, it could do very well.
In a wider context, the definition of OpenQASM 3.0 (still a ‘live spec’) explicitly builds a bridge between gate concepts and classical control concepts previously expressed at the architecture layer, and pulse & timing concepts normally expressed at the control layer. The quantum stack is set to evolve.
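As a flavour of what this bridging looks like, here is a short illustrative OpenQASM 3.0 fragment (syntax per the live spec, so details may evolve) mixing a gate, a mid-circuit measurement, classical feed-forward and explicit timing in one program:

```qasm
OPENQASM 3.0;
include "stdgates.inc";

qubit[2] q;
bit c;

h q[0];
c = measure q[0];        // mid-circuit measurement
if (c == 1) { x q[1]; }  // classical feed-forward: architecture-layer concept
delay[100ns] q[1];       // explicit timing: control-layer concept
```

Expressing real-time classical logic and timing alongside gates in one language is exactly the capability that quantum error correction and dynamic circuits will demand of the stack.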
An important aspect of IBM’s early success has been its emphasis on making IBM Quantum an incredibly useful educational tool.
Qiskit & IBM Quantum Challenges – The Qiskit textbook and tutorial resources are widely recognised as great introductory resources for quantum computing. These have been supplemented by the successful IBM Quantum Challenge series. These part-tutorial, part-competition events have become a fixture of the quantum season and are a great way for individuals with programming interest to build their quantum skills.

Black Opal – Q-Ctrl focus on the optimal control of qubits. Understanding why this is even a thing is at the heart of demystifying many aspects of the quantum stack. With commendable entrepreneurial flair, Q-Ctrl spotted that their internal visualisations are also a great way of teaching quantum newcomers about qubits and quantum computers in general. Fact Based Insight is extremely impressed by the results. The Black Opal online learning platform fills a massive gap in the market. This is not a popularised overview or an executive summary. This is an engaging way for the ‘technologically curious’ to get a sound understanding of the key concepts. It spans from basic physical principles like waves, all the way through to programming quantum algorithms with a custom interface and circuit visualisation tool. Black Opal is an ideal starting point for any quantum newcomer, even if they then plan to progress further via a more advanced framework-specific course.

Quantum Network Explorer – QuTech’s previous foray into the educational space was Quantum Inspire. This continues to offer a great learning environment, but in terms of profile has suffered from playing catch-up in a segment already dominated by IBM Quantum. On the other hand, its Quantum Network Explorer is set to benefit from being first into a wider space: how to work with qubits over networks. This introduces its own new range of resources and concepts that many believe will one day form the basis of the Quantum Internet.
SpinQ have their own highly innovative plans to turbo-charge quantum education with their unique ‘desktop’ quantum computers based on NMR qubits. The 2Q Gemini and 3Q Triangulum allow students to learn quantum concepts with the added immediacy of running experiments on real desktop devices. SpinQ have already held a successful high school quantum computing competition based on their systems, reinforced with teaching materials for the classroom. Even more compact devices are planned to follow.

Quantum Chess – Aleksander Kubica (AWS) defended his crown in the 2nd Q2B Quantum Chess Tournament, defeating opponents from Zapata, D-Wave, Nvidia, Google, Quantinuum, Horizon and QC Ware. Sometimes playing games is the best way to enthuse students.
Education is a social good. It’s also great strategy. It builds profile with exactly the individuals who will one day be driving the quantum revolution forward. It also opens the door to real revenue opportunities.
Current research themes – an elephant in the room
Much practical focus across the quantum software sector is currently on investigating error mitigation techniques that seek to help us run useful algorithms on limited NISQ devices. These are often incremental in nature and in many cases will ultimately be seamlessly integrated into the stack. However, some outstanding challenges could be more disruptive. Quantum error correction is a key quantum computing concept that is rapidly moving up the practical agenda. For many hardware players, its development is the central theme of their hardware roadmap. Others increasingly point to it not as an all-or-nothing overhead, but to how its carefully tailored application may get otherwise inaccessible algorithms over the quantum advantage line. So why aren’t we hearing more about its place in the software stack? Fact Based Insight’s simplified model nominally places it at the architecture layer. However, this is a convenient placeholder rather than an established consensus. Some players building up from the control layer will see it as natural territory for themselves. Some early algorithm players will want to closely supervise how it is used in early applications. For many the maths here will be a challenge.
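For readers new to that maths, the essential syndrome-and-decode step can be seen in the simplest possible toy: the purely classical, bit-flip-only 3-bit repetition code, sketched below in Python. Real quantum codes such as the surface code must also handle phase errors, and must run this loop across thousands of parity checks in real time, which is where fast decoders earn their keep.

```python
def syndrome(bits):
    """Parity checks of the 3-bit repetition code: compare neighbouring bits.
    A non-trivial syndrome flags an error without reading the data directly."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

# Look-up decoder: each syndrome points to the single bit-flip that explains it.
DECODE = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}

def correct(bits):
    """Apply the most likely correction for the observed syndrome."""
    flip = DECODE[syndrome(bits)]
    out = list(bits)
    if flip is not None:
        out[flip] ^= 1
    return out

# Any single bit-flip on an encoded 0 (000) or 1 (111) is repaired.
for codeword in ([0, 0, 0], [1, 1, 1]):
    for i in range(3):
        noisy = list(codeword)
        noisy[i] ^= 1
        assert correct(noisy) == codeword
```

Even in this toy, the key architectural questions are visible: who computes the syndromes, where does the decoder run, and how quickly must the correction be fed back? These are exactly the questions the stack has yet to settle.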
QEC 2021 – Just as error correction is getting real and practical, it’s a real source of frustration that due to pandemic travel restrictions, the premier biennial conference on quantum error correction didn’t take place in 2021. Fact Based Insight hopes we don’t have to wait till 2023 for a replacement. Innovations such as the linear time union-find decoder, improved lattice surgery protocols, quantum LDPC codes, and fault tolerant 3D blocks illustrate that this area still has plenty to give.
Users will not ultimately care about where and how quantum error correction is implemented, just that the qubits they see are the quality they need. However, Fact Based Insight believes that investors in the quantum stack need to pay attention to this issue now. We may very well face a world where broad quantum advantage requires quantum error correction to be applied at least to some degree. The circuit model can make physical qubit and logical qubit implementations look beguilingly similar. However, the transition could be much messier. Expect turbulence as the software market starts to fight over how the stack must adapt.
To watch in 2022
- Beyond classical – no cloud accessible quantum processor has yet demonstrated a ‘beyond classical’ computation. Will that change in 2022?
- Production use – which platform will be the first to host a quantum application in routine production use?
- Neutral atoms – expect to see the Pasqal software team probing the special aptitudes of these systems, such as analogue quantum simulation. Will ColdQuanta’s HILBERT cloud system encourage others to follow?
- Photonics – now that PsiQ has left stealth mode, will we hear more about how its FBQC will plug into the software stack?
- Cloud metrics – only IBM releases figures about usage on their quantum cloud. Will others follow suit? Watch out for signs that competition is hotting-up.
- Cloud benchmarks – QED-C produced great algorithm-based benchmarks in 2021. Will we see user-benchmarks that capture the end-to-end experience of running on quantum cloud platforms?
- Simulators – If you’re serious, you need a serious simulator. Watch out for optimised simulator performance as a differentiator at the top end.
- Business models – watch as companies carve-out the business model that is right for them. Some will be happy to be consultancies, but who can productise know-how?
- Breadth vs focus – will broad spectrum algorithms experts start to focus; will industry application experts start to diversify? Which strategy will best drive user engagement?
- SaaS – what interest will early offers such as Quantum Origin generate? How many more service offerings like this will we see in 2022?
- Compilers – Watch out as a combination of features, support and open-source tactics vie to drive take-up along the compiler chain. Will standard metrics and benchmarks be able to bring clarity?
- Quantum OS – will low-level control systems show further evidence that they can break out of their academic heartland to emerge as truly independent operating systems for commercial quantum computers?
- Engagement & education – IBM Quantum still retains a big lead in user engagement. Will we see signs that others are catching-up?
- Quantum Chess – Will Aleksander Kubica retain the world chess crown for a 3rd time? Will we see a commercial release of Quantum Chess 2.0?
- Quantum error correction – where will it fit in the stack?
Fact Based Insight would like to thank the many companies that have fed back on their entries in this briefing. In addition, we would like to thank Tom Lubinski (Chief Software Architect at Quantum Circuits and Chairman of the QED-C Committee on Quantum Computing Standards and Benchmarks) for his many insightful views and suggestions.