On Monday, Google researcher Julian Kelly unveiled Bristlecone, the company’s new record-breaking 72-qubit quantum processor, at the annual meeting of the American Physical Society in Los Angeles. Kelly and his colleagues at Google’s Quantum AI lab hope this processor will be the first chip to achieve quantum supremacy, the point at which quantum computers can perform calculations that are beyond the capabilities of even the most advanced supercomputers.

“It’s a huge jump in the number of qubits on a chip, and they are arranged in a two-dimensional layout, which complicates controlling the system,” Michele Mosca, a physicist at the University of Waterloo’s Institute for Quantum Computing, told me in an email. (He wasn’t involved with the Google processor.) “It’s much closer to what is needed to implement the surface code,” Mosca added, referring to a leading quantum error-correction scheme that protects fragile qubits so a quantum system can perform useful calculations.

The term ‘quantum supremacy’ is a bit loaded insofar as it refers only to a quantum computer outperforming classical supercomputers on certain types of problems, not across the board. How to measure a quantum processor’s performance to determine whether quantum supremacy has been achieved is also a point of contention among quantum physicists. Kelly and his colleagues at Google believe that their new processor will not only demonstrate quantum supremacy, but that they also have the tools to measure it when it happens.

According to a blog post by Kelly, the processor—nicknamed Bristlecone because the array resembles the pattern on a pinecone—is based on a nine-qubit array developed by Google researchers.

Qubits are the quantum analog of the digital bit, the smallest unit of information processed by traditional computers. Yet unlike a bit, which is binary and can only have one of two values (one or zero), a qubit can exist in a superposition of states (its value could be zero, one, or some combination of the two). Qubits should endow quantum computers with the ability to do certain tasks—such as querying a database, factoring large numbers into primes, or creating complex scientific models—far more efficiently than a supercomputer.
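The bit-versus-qubit distinction can be sketched in a few lines of Python. This is an illustrative toy, not real quantum hardware: a qubit's state is modeled as a pair of amplitudes, and "measurement" collapses it to a classical bit with probabilities given by the squared magnitudes of those amplitudes.

```python
import random

def measure(alpha, beta):
    """Collapse a qubit alpha|0> + beta|1> to a classical bit.

    Measurement yields 0 with probability |alpha|^2 and 1 with
    probability |beta|^2 (a valid state satisfies
    |alpha|^2 + |beta|^2 = 1).
    """
    return 0 if random.random() < abs(alpha) ** 2 else 1

# A classical bit is always exactly 0 or 1. A qubit in an equal
# superposition has alpha = beta = 1/sqrt(2), so each measurement
# comes out like a fair coin flip.
alpha = beta = 2 ** -0.5

random.seed(1)
samples = [measure(alpha, beta) for _ in range(10_000)]
print(sum(samples) / len(samples))  # close to 0.5
```

The superposition carries more information than the measured bit reveals; that gap between the rich internal state and the binary readout is what quantum algorithms exploit.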

But creating a working large-scale quantum computer isn’t just a matter of stringing a bunch of qubits together and letting them do their thing. Qubit arrays are challenging to build and generally require exotic materials, expensive laser setups, and/or extreme environmental conditions to operate properly, depending on whether the qubits are based on trapped ions, semiconductor spins, or, as in the case of the Google processor, superconducting circuits.

Another difficulty is that qubits themselves are quite sensitive to noise—environmental interference that can throw a qubit’s state out of whack—and as a result are highly error-prone. Creating robust qubit arrays with minimal error rates is one of the biggest hurdles standing between physicists and a functioning large-scale quantum computer.

The nine-qubit array that served as the model for Bristlecone was a significant step in that direction. As detailed in two papers published by Google researchers in 2015 and 2016, this qubit array was able to achieve 1 percent error rates for readout, 0.1 percent error rates for single-qubit gates, and 0.6 percent for two-qubit gates. A qubit gate is the quantum analog of the logic gate in a classical computer, which usually takes two inputs and performs a logical operation on them to produce a single output.

In the world of quantum processors, the nine-qubit processor’s error rates are remarkably low, especially for the two-qubit gates, which the Google researchers say are a critical component of their Bristlecone quantum processor.

“We are looking to achieve similar performance to the best error rates of the 9-qubit device, but now across all 72 qubits of Bristlecone,” Kelly wrote. “We believe Bristlecone would then be a compelling proof-of-principle for building larger scale quantum computers.”

It’s important to note that the Bristlecone processor isn’t on the cusp of quantum supremacy just yet. Before it can get there, Kelly and his colleagues must first devise a way of measuring the processor’s performance.

The problem is that when it comes to quantum supremacy, a supercomputer won’t actually be able to check the results from a quantum computer since the calculations will be too complex for it to handle. Quantum researchers have developed a number of different methods for checking the results of quantum computers with classical methods, although these often only look at specific aspects of a quantum processor rather than how the processor performs as a whole. This can make it difficult to assess whether or not the quantum processor is actually outperforming a classical computer.

Kelly wrote that theorists at Google have developed a benchmarking tool that will be able to determine if their chip has achieved quantum supremacy. The test involves running random quantum circuits on the processor and checking the measured output against a classical simulation.
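The logic of that test can be illustrated in miniature with a toy simulator. This is a sketch under my own assumptions, not Google's actual benchmark: here a small random circuit is simulated classically and its output distribution checked for validity, whereas at supremacy scale the classical-simulation step is precisely what becomes intractable.

```python
import math
import random

N_QUBITS = 3
DIM = 2 ** N_QUBITS  # the statevector has 2^n complex amplitudes

def apply_single_qubit(state, qubit, a, b, c, d):
    """Apply the 2x2 unitary [[a, b], [c, d]] to one qubit of the statevector."""
    new = state[:]
    for i in range(DIM):
        if (i >> qubit) & 1 == 0:      # pair each |...0...> basis state
            j = i | (1 << qubit)       # with its |...1...> partner
            new[i] = a * state[i] + b * state[j]
            new[j] = c * state[i] + d * state[j]
    return new

def random_rotation():
    """A random rotation about the Y axis -- a simple valid unitary."""
    t = random.uniform(0, 2 * math.pi)
    return math.cos(t), -math.sin(t), math.sin(t), math.cos(t)

random.seed(0)
state = [0j] * DIM
state[0] = 1 + 0j  # start in |000>

for _ in range(5):  # a "depth-5" circuit of random rotation layers
    for q in range(N_QUBITS):
        state = apply_single_qubit(state, q, *random_rotation())

# The ideal output distribution a real device would be sampled against.
probs = [abs(amp) ** 2 for amp in state]
assert abs(sum(probs) - 1) < 1e-9  # unitarity: probabilities sum to 1
```

Note the cost: the statevector doubles with every added qubit, so a 49-qubit circuit needs about 2^49 amplitudes, which is why classical simulation runs out of road right around the supremacy threshold.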

A quantum processor capable of outperforming a classical computer must not only have a lot of qubits; those qubits must be capable of carrying out complex operations, measured as a quantum circuit’s “depth,” which conveys how many operations a quantum processor can be expected to perform before the output is incorrect.

Each qubit in a processor can introduce error into a computation, which makes it hard to strike a balance between low error rates and the ‘power’ of a quantum processor. A system with a small number of qubits might have almost no errors (that is, a large circuit depth), but the complexity of the calculations that the qubit array is able to carry out won’t hold a candle to what a classical supercomputer can do.

The question is: How many qubits are needed, and how ‘deep’ must a quantum circuit be, to achieve quantum supremacy? This is still up for debate, but Kelly wrote that Google theorists think they have an answer.

“Although no one has achieved this goal yet, we calculate quantum supremacy can be comfortably demonstrated with 49 qubits, a circuit depth exceeding 40, and a two-qubit error below 0.5 percent,” Kelly wrote in the blog post.
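A back-of-envelope calculation shows why those targets are so demanding. Under the common simplifying assumption that gate errors strike independently, the chance that a whole circuit runs without a single error is the product of the individual gate fidelities. The gate count below is my own rough estimate for illustration, not a figure from Google:

```python
# Back-of-envelope circuit fidelity, assuming independent gate errors.
TWO_QUBIT_ERROR = 0.005  # the 0.5 percent target Kelly cites
N_QUBITS = 49
DEPTH = 40

# Rough gate count (an assumption): about N_QUBITS / 2 two-qubit gates
# per layer, since each gate touches two qubits, times DEPTH layers.
n_gates = (N_QUBITS // 2) * DEPTH  # 24 * 40 = 960 gates

# Probability that no gate fails anywhere in the circuit.
circuit_fidelity = (1 - TWO_QUBIT_ERROR) ** n_gates
print(round(circuit_fidelity, 3))  # about 0.008
```

Even at the 0.5 percent target, the vast majority of runs contain at least one error, which is why this style of benchmark leans on statistics gathered over many repeated runs rather than on any single output.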

The Google researchers have already developed a processor with 72 qubits, so now they have to increase their circuit depth and lower their two-qubit error rate even further to achieve quantum supremacy.

Yet as Mosca pointed out, not only is benchmarking quantum supremacy difficult, but the term “quantum supremacy” itself is contentious.

“We should be vigilant in making it clear that we are talking about demonstrating a quantum advantage for some problems, and not all problems,” Mosca told me. “Demonstrating it for some problems is validating the prediction that quantum mechanics enables fundamentally more powerful computation.”