Recently, quantum computing has been heralded as the new cool kid on the block. The idea is that, during a calculation, the bits being manipulated (called qubits) are never in a definite one or zero state. Instead, they can be thought of as being both a one and a zero simultaneously, which allows a quantum computer to explore many candidate solutions at the same time. The upshot is that, for a limited set of problems, quantum computers may offer a substantial speed-up over conventional computers. In new, as-yet-unpublished research, scientists have made use of the similarities between a certain type of quantum computation and neural networks to construct a very simple quantum neural network. The result may offer a faster and more robust form of pattern recognition.

To understand what the researchers have done, we need to step back and look at a particular form of quantum computing, called adiabatic quantum computing, and compare it to a specific type of neural network. In standard, gate-based quantum computing, the qubit values might be encoded in the states of a bunch of atoms. These are then individually manipulated by direct operations on their states to perform the calculation; the answer is obtained by measuring the final state of the atoms.

In adiabatic quantum computing, the qubits are still encoded in atomic states but the problem is encoded in the environment of the atoms. The combination of states with the least energy (called the ground state) for that environment is the answer to the problem. It is usually pretty easy to encode a problem in the environment, but getting the atoms to find their lowest energy state is more problematic, so the approach has to be modified slightly. The initial quantum computing environment is set so that it's simple to put the qubits in their ground state. Then, very slowly, the environment is smoothly modified to the one that encodes the problem. If this is done without exciting the atoms, the answer to the problem can be obtained by reading out their states.
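The slow-interpolation idea above can be sketched numerically. The following is a minimal, illustrative Python simulation (not the researchers' setup) of a single qubit: `H0` is an "easy" Hamiltonian whose ground state is simple to prepare, `H1` is a hypothetical problem Hamiltonian whose ground state encodes the answer, and the environment is swept smoothly from one to the other.

```python
import numpy as np

# Illustrative single-qubit adiabatic sweep (a sketch, not the actual experiment).
# H0 has an easy-to-prepare ground state; H1 encodes a toy "problem" whose
# answer is the |1> state. Interpolating slowly from H0 to H1 keeps the qubit
# in the instantaneous ground state, so measuring at the end reads out the answer.
X = np.array([[0, 1], [1, 0]], dtype=complex)
H0 = -X                                     # ground state: (|0> + |1>)/sqrt(2)
H1 = np.diag([1.0, -1.0]).astype(complex)   # ground state: |1>

def evolve(steps=2000, dt=0.05):
    # Start in the ground state of H0.
    state = np.array([1, 1], dtype=complex) / np.sqrt(2)
    for k in range(steps):
        s = k / (steps - 1)                 # interpolation parameter, 0 -> 1
        H = (1 - s) * H0 + s * H1
        # Exact propagator for one small time step, via eigendecomposition.
        vals, vecs = np.linalg.eigh(H)
        U = vecs @ np.diag(np.exp(-1j * vals * dt)) @ vecs.conj().T
        state = U @ state
    return state

final = evolve()
p1 = abs(final[1]) ** 2   # probability of measuring |1>, the problem's answer
print(p1)                 # close to 1 when the sweep is slow enough
```

Making the sweep too fast (fewer steps for the same `dt`) excites the qubit out of its ground state, which is exactly the failure mode the adiabatic approach is designed to avoid.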

Neural networks use a similar principle: memories are stored across a network of "neurons" as low-cost stable states of the system. When a pattern is presented at the network's inputs, the internal state evolves in response, settling into the stable configuration closest to that input. If one equates the cost property of neural networks with the energy state of the atomic qubits, one can see a number of similarities between neural networks and adiabatic quantum computing.
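To make the "lowest-cost stable state" idea concrete, here is a minimal classical Hopfield-style network in Python (a sketch of the general principle, not the quantum circuit from the paper). A memory is written into the connection weights, and recall lets the network run downhill in cost until it settles into the stored pattern, even from a corrupted input.

```python
import numpy as np

# Minimal classical Hopfield-style associative memory (illustrative sketch).
# Memories are stored in the weight matrix via the Hebbian rule; recall
# repeatedly updates neurons so the network's "cost" (energy) only decreases,
# until it settles into a stored stable state.
def train(patterns):
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)       # Hebbian learning: strengthen co-active pairs
    np.fill_diagonal(W, 0)        # no self-connections
    return W / patterns.shape[0]

def recall(W, state, steps=10):
    state = state.copy()
    for _ in range(steps):
        for i in range(len(state)):               # asynchronous updates
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Store one 6-bit pattern (+1/-1 encoding), then recall from a corrupted copy.
memory = np.array([[1, -1, 1, -1, 1, -1]])
W = train(memory)
noisy = np.array([1, -1, 1, 1, 1, -1])            # one bit flipped
restored = recall(W, noisy)
print(restored)                                   # settles back onto the memory
```

The stored pattern sits at a minimum of the network's cost function, which plays the same role as the ground state of the atoms' environment in the adiabatic picture.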

These similarities were exploited for pattern recognition purposes by researchers in Munich, using a liquid-state nuclear magnetic resonance spectrometer (think MRI scanner). Two qubits were encoded in the nuclear magnetic moments of hydrogen and carbon atoms, which can be linked up to form a very simple neural network called a flip-flop. The researchers encoded a series of patterns in the neural network, something that is not possible in a classical neural network this small.

The neural network was able to recognize those patterns again if they were presented as input. That is, if a one-zero state was stored along with a one-one and a zero-zero state, the neural network would give one response to a one-zero input (a pattern that was stored previously), while a zero-one input would generate a different response. Note that the classical version of this circuit can store exactly one two-bit pattern, while the quantum version can store all four two-bit patterns.

As is typical with quantum computing experiments, this is small-scale stuff. Will it scale up? The researchers seem to think so, since that is what they are trying to do now. If it does scale, we will be looking at a huge boost in the abilities of computers to recognize images and sounds. And we can all buy an extra basement for the nuclear magnetic resonance spectrometer PCI card.