Energy efficiency and operating costs are as important as raw performance in today’s datacenters. Everyone from the largest hyperscalers and high performance computing centers to the large enterprises that increasingly resemble them is trying to squeeze as much performance as possible from their infrastructure while reining in power consumption and the costs of keeping it all from overheating.

Throw in the slowing down of Moore’s Law and new emerging workloads like data analytics and machine learning, and the challenge to these organizations becomes apparent.

In response, organizations on the cutting edge have embraced accelerators like GPUs and FPGAs to push system performance without simply revving up clock speeds, and investigated new memory technologies – like conductive bridge RAM (CBRAM), resistive RAM (ReRAM), and phase change memory (PCM) – as well as software-defined and hyperconverged infrastructures. They have also built datacenters in regions with milder climates and used cooling techniques such as outside air to drive down costs and allow them to run their datacenters at slightly higher temperatures than usual.

All this work comes as the industry moves closer to exascale computing and continues to pursue such advanced techniques as quantum computing, driving the need to find new technologies and techniques to accelerate performance beyond traditional methods.

There are a lot of levers that organizations and vendors are pulling to help drive performance, but one area that is being overlooked is extremely low temperature computing, Gary Bronner, vice president of Rambus Labs, tells The Next Platform. In particular, that means driving down the temperature of systems and their components to cryogenic levels – below minus 292 degrees Fahrenheit, minus 180 degrees Celsius, or 93.15 Kelvin, depending on your preferred thermometer. The idea behind cryogenic computing is that running systems at such absurdly low temperatures can improve everything from performance and energy efficiency to density and scalability.

Current HPC systems run at around 80 degrees Fahrenheit – 27 degrees Celsius, or roughly 300 Kelvin.
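The scales quoted above line up with the standard conversion formulas. A quick sketch (helper names are illustrative, not from Rambus or Microsoft):

```python
def fahrenheit_to_kelvin(f):
    """Convert degrees Fahrenheit to Kelvin."""
    return (f - 32.0) * 5.0 / 9.0 + 273.15

def celsius_to_kelvin(c):
    """Convert degrees Celsius to Kelvin."""
    return c + 273.15

# Typical datacenter operating point: about 80F ~ 27C ~ 300K.
print(round(fahrenheit_to_kelvin(80), 2))    # ~299.82
# The cryogenic threshold cited above: -292F = -180C = 93.15K.
print(round(fahrenheit_to_kelvin(-292), 2))  # 93.15
print(round(celsius_to_kelvin(-180), 2))     # 93.15
```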

“Something not many people are talking about is playing the temperature card,” Bronner says, noting that right now most organizations are looking for ways to run their datacenters at higher temperatures. Instead, they could “move the knob a lot in the other direction. To me, I like this a lot because the end of Moore’s Law is real and I think the industry needs something significantly different. All the talk I hear is about stretching Moore’s Law in the datacenter, such as using GPU accelerators. Those will work, but it’s a one-time shot. You use it once and you are done.”

Turning to something as radical as cryogenic computing and using techniques such as Josephson junctions for processing will open up pathways for leaps in innovation similar to those seen with Moore’s Law in the past, and could lead to quantum computing and other new technologies, Bronner says. Cryogenic computing is usually tied to quantum computers, which will need to run at extremely low temperatures to operate effectively, but Bronner explains that such temperatures will drive significant improvements in more traditional system architectures as well.

Rambus and Microsoft are planning to put this idea into action. The companies in 2015 announced a partnership to research memory technologies needed for quantum computing. At the time, neither vendor mentioned cryogenics, though the goal was to develop high-bandwidth, power-efficient memory architectures. Officials with both companies late last month said they will develop a prototype of cryogenic memory based on DRAM within weeks, with further development being unveiled later this year. It is part of Microsoft’s larger efforts around quantum computing through the Station Q consortium that the software giant heads.

The idea of cryogenic memory isn’t new. Research was done as long as 25 years ago, and such vendors as IBM – where Bronner spent more than 20 years working on memory technology – took a look at it. It fell off the radar, but with growing concerns around performance, power efficiency and processor development, and the drive toward exascale and quantum computing, interest has been rekindled. As The Next Platform has reported, the U.S. Intelligence Advanced Research Projects Activity (IARPA) is looking at the work being done by the three-year-old Cryogenic Computing Complexity (C3) program as part of its research into efficient exascale systems.

The memory subsystems that Rambus and Microsoft are working on will be able to operate at minus 292 degrees Fahrenheit, which drives down power consumption while improving performance, Bronner says. On the company’s website, officials said the goal is to develop memory subsystems that can operate at 77 Kelvin and work with computers operating at 4 Kelvin – the temperature of liquid helium. That said, the temperatures of the memory subsystems needed for quantum computing will have to drop further, to as low as 20 millikelvins or less – colder than deep space.

The reason is that quantum bits, or qubits, need to be stabilized. Current computers use bits that are in states of either 0 or 1. Quantum computers use qubits, which can be 0 and 1 at the same time. It’s that capability that will enable quantum computers to quickly resolve complex algorithms and run computations that modern systems just can’t. A key to quantum computing is the coherence time of the qubits – how long the fragile quantum state survives. For computing, it is important for the quantum state to survive long enough to perform its task. Cryogenic computing temperatures can help enable that, Bronner says.
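The "0 and 1 at the same time" behavior can be made concrete with a minimal sketch (not from the article): a qubit's state is a pair of complex amplitudes over the 0 and 1 outcomes, whose squared magnitudes give the measurement probabilities and must sum to one.

```python
import math

def measurement_probabilities(alpha, beta):
    """Return (P(measure 0), P(measure 1)) for a normalized qubit state.

    alpha and beta are the amplitudes on the |0> and |1> outcomes.
    """
    p0 = abs(alpha) ** 2
    p1 = abs(beta) ** 2
    assert math.isclose(p0 + p1, 1.0), "state must be normalized"
    return p0, p1

# A classical bit is definitely 0 or definitely 1...
print(measurement_probabilities(1.0, 0.0))   # (1.0, 0.0): always reads 0
# ...while an equal superposition is "0 and 1 at once" until measured.
s = 1.0 / math.sqrt(2.0)
print(measurement_probabilities(s, s))       # roughly (0.5, 0.5)
```

Decoherence is the loss of this superposition to environmental noise, which is why the coherence time matters and why the hardware is chilled so aggressively.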

“You want the qubits to stay coherent together long enough to do their calculation, and to do that it has to be at really low temperatures,” he says, with 20 millikelvins being the thermal ceiling.

There is a lot of work ongoing to develop new memory technologies that can be used in next-generation extreme-scale systems and quantum computers. Hyperscalers like Google are doing such research, as are Microsoft and Rambus. Hewlett Packard Enterprise (HPE) will use memristor technology for The Machine, the massive high performance system that the company has been working on for several years. Bronner says that for the prototype, Rambus and Microsoft are using essentially off-the-shelf DRAM. Researchers were able to drop the temperature of the DRAM to 77 Kelvin and still work with it, so they are using it as a starting point.

What commercialized cryogenic memory might look like and how it will work in a system remains unclear. Pools of cryogenic memory subsystems may be linked via high-speed interconnects to systems housed in rooms at a different temperature. But in order to get to true quantum computing, the temperatures of the systems and their components are going to have to drop, Bronner says. Cooling systems and facilities to such low temperatures will cost a lot of energy and money, but the improvement in performance/watt still produces a net gain, he says. By dropping the temperature of the system to 4 Kelvin, there is a four-fold improvement in performance/watt over current technologies. Even after adding in the cost of cooling, performance/watt is still improved by 100 percent, he says.
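The arithmetic behind those figures can be sanity-checked with a simple model (my assumption, not Bronner's: cooling overhead expressed as a multiplier on total power draw). A 4x raw improvement that nets out to 100 percent (2x) implies cooling roughly doubles the power bill.

```python
def net_perf_per_watt_gain(raw_gain, cooling_power_multiplier):
    """Net performance/watt gain once cooling power is charged to the system.

    raw_gain: performance/watt improvement of the cold hardware alone.
    cooling_power_multiplier: factor by which cooling inflates total power.
    """
    return raw_gain / cooling_power_multiplier

# 4x raw gain at 4 Kelvin, cooling doubles the power bill -> 2x net (100%).
print(net_perf_per_watt_gain(4.0, 2.0))  # 2.0
```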

Currently, D-Wave is the only vendor that offers a commercially available quantum computer. The D-Wave 2000Q has 2,000 qubits, although some have questioned whether the company’s use of a form of computation called “quantum annealing” is true quantum computing. IBM and other companies, such as Google, also are pursuing quantum computing. IBM last year made quantum computing capabilities available on the IBM Cloud, and in March unveiled IBM Q, a unit with the goal of developing universal quantum computers for both commercial and scientific uses. In addition, NASA is working with tech vendors in private-public groups researching quantum computing.