While quantum computers already exist, such as IBM's Quantum Experience, they are still only capable of relatively simple calculations. For truly useful quantum computers to exist, many technological challenges must be overcome, even though some roadblocks, like data transfer and using voltage in lieu of lasers in quantum systems, have already been tackled. Researchers at the University of Sydney have now blasted through another quantum quandary, potentially bringing stability to the notoriously unstable world inside these systems. New Atlas spoke with one of the researchers to get more information about the potentially game-changing work.

Quantum computers make their calculations using quantum bits, or qubits, which in many systems are realized as individual trapped atoms or ions. Unlike the bits in standard computers, which represent either a 1 or a 0, qubits can occupy both states at once, letting quantum computers carry out many calculations simultaneously and promising dizzying processing speeds. One problem with these systems, however, is that qubits are inherently unstable: quantum computing systems are subject to a form of degradation known as decoherence.
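Superposition can be sketched numerically: a qubit's state is a normalized pair of complex amplitudes over the 0 and 1 basis states, and reading it out yields each outcome with probability given by the squared magnitude of the corresponding amplitude. This is a toy illustration of the math, not code tied to the Sydney team's hardware:

```python
import numpy as np

# A classical bit is 0 or 1; a qubit is a normalized pair of complex
# amplitudes over the |0> and |1> basis states.
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)  # equal superposition

# Measuring the qubit collapses it to 0 or 1; the outcome probabilities
# are the squared magnitudes of the amplitudes.
p0, p1 = np.abs(psi) ** 2
# For the equal superposition above, each outcome occurs half the time.
```

This is also why measurement is destructive: after the readout, the state is no longer the superposition but whichever basis state was observed.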

"Much the way the individual components in mobile phones will eventually fail, so too do quantum systems," said University of Sydney Professor Michael J. Biercuk in a statement. "But in quantum technology the lifetime is generally measured in fractions of a second, rather than years."

So the researchers reasoned that if they could predict how the system was going to disintegrate, they could proactively counteract it. But there's another problem with quantum computing systems: when you observe them, you change them.

Biercuk explained to New Atlas:

One major challenge in quantum physics is that the "quantumness" of a system – that is, its ability to demonstrate exotic properties such as superposition, being in more than one state at the same time – is lost once the system is observed. We say that measuring a quantum system destroys quantum information. Accordingly, measurements can generally be used only rarely in quantum systems – they are a "costly resource."

The challenge here is that most engineering techniques aiming to achieve stability in some technology rely on measurement: Think about cruise control in your car, which measures your speed relative to a setpoint and then adjusts to keep it constant. It needs frequent measurements in order to work well.

Because measurement destroys quantumness, our objective was to find a way to keep the qubits stable with minimal measurements. Instead of measuring frequently or constantly, we measured infrequently and then predicted how the qubits would change in the future between measurements.
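The cruise-control analogy Biercuk draws is ordinary measurement-based feedback, which a toy proportional controller makes concrete. The point is that this classical approach measures its error on every single step, exactly the luxury a quantum system cannot afford (the function and parameters below are invented for illustration):

```python
def cruise_control(speed, setpoint=100.0, gain=0.5, steps=20):
    """Toy proportional controller: it works precisely because it can
    measure the current speed on every single step."""
    for _ in range(steps):
        error = setpoint - speed   # frequent measurement against the setpoint
        speed += gain * error      # proportional correction
    return speed

final = cruise_control(speed=80.0)
# The speed converges on the 100 km/h setpoint.
```

Each iteration halves the remaining error, so after 20 measure-and-correct cycles the speed is effectively at the setpoint. Deny the controller most of those measurements and it drifts, which is the problem the Sydney team faced.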

To do this, he and his team employed machine learning and found that they could accurately predict how the qubits would degrade, and so counter the behavior. In effect, this let them anticipate the system's evolution without directly observing it, then take steps to stabilize it.

"By taking just a small number of measurements on our system, a machine learning program was able to extract information about how the qubits were randomly changing in time due to the environment," Biercuk explained to us. "Figuring out that information allowed the algorithm to then predict how the qubits would randomly change in the future. This action is called 'prediction' and is a bit similar to the way machine learning algorithms can predict future consumer behavior from past purchasing history.

"We calculate these predictions on the fly as we take new measurements about the qubits, and then preemptively apply 'corrections' to stabilize the qubits against what changes we think will occur," he added. "The net result is that we can preemptively correct for changes that randomize the qubit before they occur."
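The team's actual method used machine learning on real qubit data; as a loose, simplified analogue of the measure-rarely-and-predict idea, the toy loop below lets an error drift at every step, takes a "costly" measurement only occasionally, uses those sparse readings to estimate the drift rate, and applies preemptive corrections in between. All names, noise values, and the drift model are invented for illustration:

```python
import random

random.seed(0)
DRIFT = 0.05  # true average drift per step (hidden from the controller)

def stabilize(steps=60, measure_every=10):
    """Measure rarely; between measurements, predict the drift and
    preemptively subtract it before it accumulates."""
    error = 0.0      # net error in the qubit parameter after corrections
    est_rate = 0.0   # drift-per-step estimate learned from sparse readings
    residuals = []
    for step in range(1, steps + 1):
        error += DRIFT + random.gauss(0.0, 0.005)  # environment randomizes the qubit
        error -= est_rate                          # preemptive predicted correction
        if step % measure_every == 0:              # the rare, "costly" measurement
            residual = error                       # leftover error prediction missed
            est_rate += residual / measure_every   # refine the drift-rate estimate
            error -= residual                      # correct what the reading revealed
            residuals.append(abs(residual))
    return residuals

residuals = stabilize()
# Once the first measurement teaches the controller the drift rate, the
# leftover error between measurements shrinks to the noise floor.
```

Before the first measurement the error accumulates unchecked; afterward, the predicted corrections absorb almost all of the drift, so only the random noise remains, which is the qualitative effect the researchers describe.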

The prediction and stabilization technique now joins other advances in quantum computing, such as making qubits from silicon and introducing bridging to quantum systems, bringing the day when we have blindingly fast processors at our disposal one step closer.

The work on the breakthrough has been published in the journal Nature Communications.

Source: The University of Sydney