In the US pharmaceutical sector alone, if quantum simulations of complex atomic processes were available today, and 10% of companies were willing to pay for the capability, quantum computing would represent a $15 billion to $30 billion addressable market opportunity. By comparison, the current global market for all high-performance computing is about $10 billion.

There are other practical applications. Quantum computing can be applied to accelerate search and machine learning algorithms used with exponentially growing datasets. This will become increasingly important to unlocking the value of data, as the tens of billions of devices in the Internet of Things, for example, drive the volume of available data into the stratosphere.

For some classes of problems, finding a solution requires trial and error: testing candidate solutions one by one until the right one turns up. Imagine an archipelago of thousands of islands connected by bridges and the need to find a path that crosses each island exactly once. The number of possible paths rises exponentially with the number of islands, but checking whether a given path visits each island only once is straightforward. If our hypothetical island puzzle had 1 million possible solutions, a binary computer would require an average of 500,000 tries to find the right one. A quantum processor running Grover’s algorithm would solve the problem in only about 1,000 attempts, 500 times faster.

Where a binary computer would need 500,000 tries to find the right solution, a quantum processor would need only 1,000.
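The arithmetic behind these figures is easy to verify: classical unstructured search checks N/2 candidates on average, while Grover’s algorithm needs on the order of √N oracle queries (the precise count is about (π/4)√N; the rounded √N figure is used here to match the numbers above). A quick sketch:

```python
import math

def classical_avg_tries(n_solutions: int) -> int:
    # Unstructured search: on average you check half the candidates.
    return n_solutions // 2

def grover_tries(n_solutions: int) -> int:
    # Grover's algorithm needs on the order of sqrt(N) oracle queries.
    return round(math.sqrt(n_solutions))

n = 1_000_000
print(classical_avg_tries(n))                     # 500000
print(grover_tries(n))                            # 1000
print(classical_avg_tries(n) // grover_tries(n))  # 500-fold speedup
```

The speedup is quadratic, not exponential, which is why Grover-style search helps most when the candidate pool is very large.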

This is equivalent to the type of problem faced by search algorithms and the large, multilayer neural networks that underlie machine learning. For neural networks to handle such tasks as object detection and identification (determining whether the object that suddenly appears in front of an autonomous car is a wind-blown plastic bag or a baby carriage, for example), they need to be trained on large data sets and a large number of outcomes through trial and error and supervised learning. Machine learning and artificial intelligence have become a reality through the combination of large data sets and parallel, low-cost GPUs; quantum computers could accelerate the training of neural networks and increase the amount of data they can handle. This application is an active field of research as scientists and engineers try to identify quantum algorithms that can be harnessed for machine learning. As more algorithms are discovered, the fundamental advantages of quantum over classical computers could lead to the displacement of the $20 billion market for high-performance machine-learning computing by 2030.

The Technology Today…

Quantum computing’s power comes from the fact that it is a fundamentally different technology from the binary, Boolean logic–based computing we are all used to. There are three essential differences. The first has to do with the bits themselves. Binary computers use bits: everything is based on 1s and 0s or, as some like to think about it, on and off. Picture a light switch, which has only two positions. Qubits, on the other hand, can inhabit states of 1 and 0, on and off, at the same time. A qubit is less a light switch than a dimmer with a theoretically infinite number of settings. Qubits are about probabilities rather than black-or-white certainties, and this is simultaneously a big enabler and a substantial problem (more on this below).
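The dimmer analogy can be made concrete in a few lines. In this minimal sketch (the function name `measure_probs` is ours, for illustration), a qubit is a pair of complex amplitudes whose squared magnitudes give the probabilities of reading 0 or 1 when the qubit is measured:

```python
import math

def measure_probs(a0: complex, a1: complex) -> tuple[float, float]:
    # A qubit's state is a pair of amplitudes (a0, a1) with
    # |a0|^2 + |a1|^2 = 1; measurement yields 0 with probability
    # |a0|^2 and 1 with probability |a1|^2.
    p0, p1 = abs(a0) ** 2, abs(a1) ** 2
    assert math.isclose(p0 + p1, 1.0), "amplitudes must be normalized"
    return p0, p1

# "Light switch": the qubit is definitely off.
print(measure_probs(1.0, 0.0))       # (1.0, 0.0)

# "Dimmer": an equal superposition reads 0 or 1 with 50/50 odds.
s = 1 / math.sqrt(2)
p0, p1 = measure_probs(s, s)
print(round(p0, 3), round(p1, 3))    # 0.5 0.5
```

Unlike a dimmer, though, a measurement always snaps the qubit to 0 or 1; the in-between setting shows up only in the statistics of many measurements.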

The second difference is that binary computers keep all those 1s and 0s separate; they have to in order to run their calculations. Quantum computing works on the purposeful entanglement of qubits; by manipulating one, you simultaneously manipulate all of its entangled mates. Adjusting one light dimmer affects all the others in the room—and all the others in the house. This is what gives quantum computing its calculating prowess.
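Entanglement, too, can be illustrated numerically. In this sketch (variable names are ours), two qubits share the Bell state (|00⟩ + |11⟩)/√2: the outcomes 01 and 10 have zero probability, so the two "dimmers" always read the same value, and measuring one instantly determines the other:

```python
import math

# Amplitudes of a two-qubit Bell state, indexed by basis state.
s = 1 / math.sqrt(2)
bell = {"00": s, "01": 0.0, "10": 0.0, "11": s}

# Measurement probabilities are the squared magnitudes.
probs = {k: round(abs(a) ** 2, 3) for k, a in bell.items()}
print(probs)  # {'00': 0.5, '01': 0.0, '10': 0.0, '11': 0.5}

# The mixed outcomes never occur: the qubits are perfectly correlated.
assert probs["01"] == 0.0 and probs["10"] == 0.0
```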

The third difference lies in the way that quantum computers do their work. While binary computers conduct massive numbers of arithmetic calculations sequentially, a quantum computer explores all possible outcomes concurrently and settles on a probably correct answer through interference: constructive interference amplifies the right answers, while destructive interference “cancels out” the wrong ones. In the example of the bridges connecting the islands, a quantum computer would simultaneously consider all potential routes and settle on one that is probabilistically “correct.”
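For tiny search spaces, this interference can be simulated on a classical machine. The sketch below (an illustration, not production quantum code) runs Grover's oracle-plus-diffusion steps on an 8-candidate search: after two iterations, the wrong answers have largely canceled out and the marked answer carries about 95% of the measurement probability:

```python
import math

def grover(n_states: int, marked: int, iterations: int) -> list[float]:
    # Start in a uniform superposition: every candidate equally likely.
    amps = [1 / math.sqrt(n_states)] * n_states
    for _ in range(iterations):
        # Oracle: flip the sign of the marked (correct) answer's amplitude.
        amps[marked] = -amps[marked]
        # Diffusion: reflect every amplitude about the mean. Interference
        # boosts the marked amplitude and suppresses the rest.
        mean = sum(amps) / n_states
        amps = [2 * mean - a for a in amps]
    return [a * a for a in amps]  # measurement probabilities

probs = grover(n_states=8, marked=3, iterations=2)
print(round(probs[3], 3))  # 0.945: the wrong answers largely cancel out
```

Running too many iterations overshoots and the marked probability falls again, which is why the iteration count is tuned to roughly (π/4)√N.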

From a practical engineering standpoint, quantum computers have real constraints. Quantum circuits work best at very low temperatures, near absolute zero. Quantum states are highly unstable; any outside influence increases the chance of error, which is why the circuits must be supercooled and isolated. Qubit stability, or coherence, and error correction are major issues. Indeed, as machines get big enough to do useful simulations, the ratio of physical qubits (required for control and error correction) to the logical qubits doing the actual work can be as high as three thousand to one. For these reasons, quantum computers require significant surrounding infrastructure and resemble old-style mainframes in large, climate-controlled data centers (just a lot colder!) much more than they do today’s laptops or smartphones.

…And Tomorrow

Today’s quantum computers are in the very early stages of invention, not unlike classical computing in the early 1950s, shortly after John Bardeen, Walter Brattain, and William Shockley of Bell Labs invented the solid-state transistor that replaced the vacuum tubes powering the earliest computers and set the tech industry off on the pursuit of ever smaller and more powerful processors that continues to this day.

For two quantum technologies, trapped ion and superconductor, commercial applications are in sight.

Several quantum technologies are racing to reach useful qubit thresholds. Two have made sufficient progress for commercial application to be in sight: trapped ion and superconductor. Trapped ion is widely viewed as producing the highest-quality qubits (those with the lowest inherent error rates) and therefore currently holds an advantage over superconductor, both in time to market for key applications and in capital cost. At the end of 2017, researchers working on trapped-ion machines successfully entangled 14 qubits to perform a designated operation with a logical success rate of 99.9%. The comparable numbers for superconductor are 9 qubits and 99.4%. If each technology developed at a Moore’s-law pace (with no improvement in error correction), trapped ion would be the first to reach the threshold of 150 logical qubits necessary for major quantum simulation applications, but not until around 2040.
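A back-of-the-envelope version of that timeline can be built from the figures cited here plus one assumed parameter: the 1.5-year doubling cadence below is our illustrative assumption, not the model behind the 2040 estimate.

```python
import math

# How long to reach 150 logical qubits if physical-qubit counts double
# at a fixed cadence and error correction costs ~3,000 physical qubits
# per logical qubit (the worst-case ratio cited above)?
logical_target = 150
overhead = 3000                               # physical per logical qubit
physical_target = logical_target * overhead   # 450,000 physical qubits

start_qubits = 14        # trapped-ion entanglement record cited above
doubling_years = 1.5     # assumed cadence (illustrative)

doublings = math.log2(physical_target / start_qubits)
print(round(doublings, 1))                        # ~15 doublings
print(2017 + round(doublings * doubling_years))   # 2039, near ~2040
```

The exercise also shows why error correction dominates the economics: cutting the 3,000:1 overhead shrinks the physical-qubit target, and hence the timeline, far faster than any other lever.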

That said, the need for error correction is the biggest driver of resource requirements and has an outsized impact on scale and cost. A significant reduction in error-correction requirements could accelerate trapped ion toward key thresholds in scale and cost much sooner, perhaps as early as 2028 to 2030. Microsoft is pursuing a quantum computing technology with a potential one-to-one ratio of physical to logical qubits, but no working prototype has yet been produced. In the short term, we believe trapped ion is well positioned to be first to market, but it still carries many of the risks inherent in early-stage technologies.

Once technical feasibility is established, we expect to see S-curve adoption patterns, similar to those of other advanced technologies. Adoption for each application will depend on the degree of the advantage conferred by quantum processing and the maturity of the algorithms directing the problem solving. More specifically, given that quantum computing can operate in the mode of platform-as-a-service, applications in which there is a significant speed advantage could see rapid adoption, on the order of 70% penetration within five years, similar to the adoption rate of GPUs in machine learning applications. Applications that offer a moderate speed advantage could take up to 15 years to reach 50% penetration (the development of software as a service is a useful analogy), while applications with unknown algorithms and potential will almost surely follow slower adoption curves, with quantum computing augmenting binary processing in 25% or fewer cases even after 15 years.
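Such S-curves are conventionally modeled with a logistic function. Here is a toy sketch with illustrative parameters (the ceiling, midpoint, and rate values are our assumptions, chosen only to reproduce the roughly-70%-in-five-years shape described above):

```python
import math

def adoption(year: float, ceiling: float, midpoint: float, rate: float) -> float:
    # Logistic S-curve: ceiling is long-run penetration, midpoint is the
    # year of fastest adoption, rate controls steepness.
    return ceiling / (1 + math.exp(-rate * (year - midpoint)))

# High-advantage application: ~70% penetration within ~5 years of launch.
for t in (0, 2.5, 5):
    print(round(adoption(t, ceiling=0.75, midpoint=2.5, rate=1.5), 2))
# prints 0.02, then 0.38, then 0.73
```

A slower-adoption application would simply use a lower ceiling and a gentler rate, stretching the same curve over 15 years or more.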

Overall, we project a substantial market for quantum computing, but the timing could vary widely depending on when the critical technical milestones are reached that unlock actual business-applicable computing capacity. In a “base-case” scenario (assuming a Moore’s law speed of qubit development with no improvement on error correction), the market for quantum applications would reach about $2 billion in 2035, then soar to more than $260 billion by 2050 as adoption picks up. An “upside” case, in which there is significant reduction in the need for error correction, would see a substantial market develop much sooner: about $60 billion in 2035, growing to $295 billion by 2050 (compared with an $800 billion global commercial and consumer computing market today). (See “The Quantum Stack and Its Business Models” and Exhibit 3.)
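For readers who want the implied growth rates, the scenario figures above translate into compound annual growth rates as follows (simple arithmetic on the article's numbers, not an independent forecast):

```python
def cagr(start: float, end: float, years: int) -> float:
    # Compound annual growth rate between two market-size figures.
    return (end / start) ** (1 / years) - 1

base = cagr(2, 260, 2050 - 2035)     # base case: $2B -> $260B over 15 yrs
upside = cagr(60, 295, 2050 - 2035)  # upside: $60B -> $295B over 15 yrs
print(f"{base:.0%}, {upside:.0%}")   # 38%, 11%
```

The contrast is the point: the upside case grows more slowly after 2035 only because so much of the market has already materialized by then.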