The fastest supercomputers are built with the fastest microprocessor chips, which in turn are built upon the fastest switching technology. But even the best semiconductors are reaching their limits as more is demanded of them. In the closing months of this year came news of several developments that could break through silicon’s performance barrier and herald an age of smaller, faster, lower-power chips – developments that could become commercially viable within the next few years.

In December, Google and Nasa announced that for problems involving nearly 1,000 binary variables, ‘quantum annealing’ significantly outperforms a classical computer – more than 10⁸ times faster than simulated annealing running on a single-core computer. The researchers think they’ve found a quantum algorithm that solves certain problems 100 million times faster than conventional processes on a PC.

In a paper published in the journal Science in October, IBM researchers announced they had made the first carbon nanotube transistors that don’t suffer reduced performance when shrunk in size, making it easier to scale down chips. Another team reported in Nature that they had created a quantum logic gate in silicon for the first time, making calculations between two quantum bits of information possible – and a silicon-based quantum computer an achievable reality.

Both results represent milestone scientific achievements and are highly complementary, said Mikko Möttönen, leader of the quantum computing and devices lab at Aalto University, Finland, and professor in quantum computing at the University of Jyväskylä. Möttönen was not involved in either research project, and so is in a position to be an impartial commentator.

Beyond silicon?

In the search for speedier processors, materials scientists are looking for ways to improve upon Complementary Metal Oxide Semiconductor (CMOS) technology. As silicon-based chips are shrunk towards their physical limits, their performance will eventually bottleneck, in part due to overheating.

With the best of today’s technology, a 100 Petaflop machine that runs at 30 per cent computational efficiency would deliver the same usable compute power as a proposed Exascale machine, at about one tenth the energy cost, as reported in The new realism: software runs slowly on supercomputers (SCW August/September 2015, page 20).

In July 2015, US President Barack Obama signed an executive order creating the National Strategic Computing Initiative, to encourage faster development of the first Exaflop supercomputer. But without innovation, the power requirements of the first Exascale supercomputer – intended to find new medicines or run climate change simulations – become astronomical, both in cost and in real terms.

Today the most efficient systems need about one to two megawatts per Petaflop. One estimate has an Exascale computer sucking up 40 megawatts – enough power for a small town of 50,000 people.

Commercial enterprises are investing heavily in innovation. IBM Research in the US is investing $3 billion in chip R&D technologies aimed at replacing traditional silicon, including carbon nanotubes – single atomic sheets of carbon rolled up into a tube. Electrons can move through carbon transistors more easily than through silicon ones.

Carbon nanotubes

This October, a team of IBM scientists demonstrated a new way to shrink the transistor ‘contacts’ of carbon nanotube devices without reducing performance. This was done with a microscopic welding technique that chemically binds metal atoms to the carbon atoms at the ends of the nanotubes.

With this, contact resistance challenges could be overcome down to the 1.8 nanometre node, meaning carbon nanotube-based semiconductors could yield smaller chips with greater performance and lower power consumption.

Contacts inside a chip act as valves that control the flow of electrons from metal into the semiconductor’s channels. As transistors shrink, electrical resistance increases within the contacts, impeding performance. Some estimates put the performance of carbon nanotube circuits at five to 10 times better than silicon ones.

But the death of silicon has been predicted many times in the past. Gallium arsenide, for example, was once touted as a better replacement. But silicon is abundant and cheap to process. In addition, a silicon crystal has a very stable structure, can be grown to very large diameter boules and processed with very good yields. It is also a fairly good thermal conductor, thus enabling very dense packing of transistors that need to get rid of their heat of operation. Finally, there is a vast number of production plants already installed and dedicated to making processors out of silicon, yielding huge economies of scale to the silicon industry. ‘For this new technology to become commercially viable, it has to beat the current transistor, the development of which has been given decades and billions – if not trillions – of euros,’ said Möttönen.

Quantum computing

So, rather than an exotic material such as carbon nanotubes, an alternative path could be radical innovation in silicon technology itself. In Australia, researchers have created the first two-quantum-bit (qubit) logic gate within silicon, which could bring scalable quantum computers closer.

Principal investigator Professor Andrew Dzurak, based at the University of New South Wales (UNSW) in Australia, and his team found that qubits could influence each other directly, acting as a logic gate, when performing calculations using the mechanics of subatomic particles.

Like a compass needle, the spin of an electron dictates the binary code of ‘0’ or ‘1’. But a quantum particle can also exist in both states simultaneously – a superposition. A two-qubit system can thus perform simultaneous operations on four values, a three-qubit system on eight values, and so on.
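That exponential growth can be sketched in a few lines of code. This toy illustration (not from either research project) builds the amplitude table for n qubits placed in an equal superposition, showing that the register spans 2ⁿ values at once:

```python
# Toy illustration: an n-qubit register is described by 2**n amplitudes,
# so two qubits span four values at once and three qubits span eight.
from itertools import product

def uniform_superposition(n):
    """Return the 2**n amplitudes of n qubits in an equal superposition."""
    amp = (1 / 2 ** 0.5) ** n          # every basis state equally weighted
    return {''.join(bits): amp for bits in product('01', repeat=n)}

state = uniform_superposition(2)
print(len(state))                      # four basis states: 00, 01, 10, 11
```

A gate applied to such a register acts on every amplitude at once, which is where the ‘simultaneous operations’ in the text come from.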

The team morphed their silicon transistors into quantum bits by ensuring that each one had only one electron associated with it. Then they stored the binary code on the spin of the electron.

‘These two research directions have rather different strategies,’ said Benjamin Huard, a CNRS researcher heading the quantum electronics group at the Ecole Normale Supérieure of Paris, France. Huard, too, is in a position to act as an impartial commentator. ‘The UNSW team… shows that spins in silicon constitute promising candidates. Comparatively, the IBM discovery is more incremental, since it can readily be applied to usual computers if the technology is pushed to its limits.’

Time for development

However, even if all goes well, it may take at least a decade before a commercial qubit chip is ready. ‘We are aiming to have a prototype chip that demonstrates the manufacturing pathway ready in five years. I think it will be very challenging to have a commercially available processor chip ready within 10 years,’ said Dzurak. The Australian team has just patented its design for a full-scale quantum computer chip of millions of qubits. The engineering programme to scale this technology from chip to a supercomputer-scale system has just begun. ‘If we could do it in less than 15 years, I’d be a very happy man. I think most experts in the field would agree with my assessment,’ said Dzurak.

Back in 1998, researcher Bruce Kane first proposed the idea of a silicon-based quantum computer in a Nature paper. In theory, a quantum computer with just 300 qubits could hold 2³⁰⁰ values simultaneously – around the number of atoms in the known universe – performing an incredible quantity of calculations at once.

In reality, qubits are prone to errors, so lots of extra ‘ancilla’ qubits are needed, which play a secondary error-correction role in a logic circuit. The actual number of physical qubits needed for equivalent – and, most importantly, accurate – computational power could add up to millions when scaled up in silicon semiconductor technology.

IBM scientists recently made a new type of chip that for the first time was able to detect and measure both kinds of quantum errors – bit-flip and phase-flip – simultaneously.

‘There are other qubits in the lattice that serve as the data or code qubits, and hold the quantum information. These data or code qubits get checked by the ancillas,’ said Jerry Chow, manager of IBM Research’s Experimental Quantum Computing Group.
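A rough classical analogue shows how ancilla-style checks on data bits work. This hypothetical sketch uses the three-bit repetition code, which catches only bit-flip errors (unlike the IBM chip, which detects both bit-flips and phase-flips): two parity checks on neighbouring data bits reveal which bit flipped, without reading the data directly.

```python
# Hypothetical classical sketch of ancilla-style error checking:
# the three-bit repetition code, correcting a single bit-flip error.

def encode(bit):
    return [bit, bit, bit]             # one logical bit -> three data bits

def syndrome(data):
    """Ancilla-style parity checks on neighbouring data bits."""
    return (data[0] ^ data[1], data[1] ^ data[2])

def correct(data):
    s = syndrome(data)
    # each non-trivial syndrome points at exactly one flipped bit
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(s)
    if flip is not None:
        data[flip] ^= 1                # undo the flip
    return data

word = encode(1)
word[1] ^= 1                           # inject a single bit-flip error
print(correct(word))                   # [1, 1, 1] – the error is undone
```

The quantum versions of these checks must also avoid collapsing the superposition, which is why the ancillas measure parities rather than the data qubits themselves.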

Quantum decoherence causes errors in calculations, arising from interference from many sources – and these errors are especially acute in quantum machines.

‘We do believe we have a promising path forward for scalability... Systems of 50-100 qubits we expect to be possible within the next five years,’ said Chow.

Commercial quantum computers

To date, machines from Canada-based D-Wave are the only commercially available quantum computers of their type on the market. In 2011 a D-Wave quantum computer was sold to the company Lockheed Martin, and in 2013 a 500-qubit D-Wave Two system was installed at Nasa Ames, where researchers from Google, Nasa, and the Universities Space Research Association (USRA) have been using it to explore the potential for quantum computing. This year, Los Alamos National Laboratory in the US purchased one.

The computer’s processors use a process called quantum annealing to exploit quantum mechanical effects, such as tunnelling and entanglement. In December, research by the team at Nasa Ames showed that quantum annealing significantly outperformed a classical computer for problems involving nearly 1,000 binary variables. The team thinks it has found a quantum algorithm that solves certain problems 100 million times faster than conventional processes on a PC.
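For comparison, the classical baseline in that benchmark – simulated annealing – can be sketched in a few lines. The cost function, cooling schedule and parameters below are illustrative inventions, not those used in the Nasa Ames study: the algorithm flips one binary variable at a time, always keeping moves that lower the ‘energy’ and occasionally keeping ones that raise it, with that tolerance shrinking as the temperature falls.

```python
# Illustrative sketch of classical simulated annealing over binary variables.
# The cost function and parameters are invented for demonstration only.
import math
import random

def energy(x):
    """Toy cost: count neighbouring bits that disagree."""
    return sum(1 for a, b in zip(x, x[1:]) if a != b)

def simulated_annealing(x0, steps=5000, t_start=2.0, t_end=0.01, seed=0):
    rng = random.Random(seed)
    x, best = list(x0), list(x0)
    for step in range(steps):
        # geometric cooling schedule from t_start down to t_end
        t = t_start * (t_end / t_start) ** (step / steps)
        i = rng.randrange(len(x))
        old = energy(x)
        x[i] ^= 1                                 # propose one bit flip
        # keep downhill moves; keep uphill moves with Boltzmann probability
        if energy(x) > old and rng.random() >= math.exp((old - energy(x)) / t):
            x[i] ^= 1                             # reject: undo the flip
        if energy(x) < energy(best):
            best = list(x)                        # remember the best state
    return best

result = simulated_annealing([0, 1] * 10)         # start from the worst case
```

Quantum annealing replaces the thermal escape from local minima with quantum tunnelling through the energy barriers, which is where the claimed speed-up arises.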

Despite this progress, doubts remain. ‘I do not rule out a quantum-annealing design, but it is not clear if such a technology will really scale in the way it needs to, in order to overtake conventional processors,’ said Dzurak.

Although technically impressive, the D-Wave may be no faster than classical computers. ‘It is not clear if the current D-Wave computers are truly quantum computers. There is no evidence that they are faster than classical computers,’ said Dr Menno Veldhorst, a UNSW research fellow and lead author of the two-qubit paper.

Future developments include chips directly interfacing with other components using light, rather than electrical signals. ‘One problem with photon-based quantum computers (QCs) is that there are a lot of overheads to make the chip function,’ said Dzurak. ‘I wouldn’t rule it out. There is a lot of interesting work on photonic-based QCs. If I had to place a bet, I would say the first commercial system will either be a silicon-based QC or a superconductor-based QC.’

Quantum dots

Veldhorst also thinks large-scale architectures will likely come from silicon-based quantum-dot qubits and superconducting qubits – something Professor John Martinis’ research group at the University of California, Santa Barbara, and Google is currently working on.

A quantum dot breakthrough was recently achieved by a team of physicists at the Technical University of Munich, Germany, and the Los Alamos National Laboratory and Stanford University in the US. They produced a system of a single electron trapped in a semiconductor nanostructure, with the electron’s spin used as the data carrier.

The team found data-loss problems caused by strains in the semiconductor material, but these were solved when an external magnetic field with the strength of a strong permanent magnet was applied. This system of quantum dots (nanometre-scale hills) was made of semiconductor materials that are compatible with standard manufacturing processes. ‘A large-scale quantum computer will take another decade or two,’ said Veldhorst.