With the exponential growth in data centers and electronic devices over the last decade, waste heat has become a significant but largely overlooked environmental problem. It is frequently hidden from view, in data centers and server farms at remote locations around the world, but its environmental cost is very real.

A strong driver of the growth in energy usage is our dependence on software and the ever-increasing amount of hardware needed to support it. Waste heat may only get worse as new technology enters our lives. While a handful of companies have started to reduce or recycle waste heat in various ways, existing mitigation methods are typically inefficient and do little to curb the rise in waste-heat-related pollution. Now, though, a new generation of water-cooled GPU systems is coming online and promising to revolutionize the energy efficiency of everything from artificial intelligence to cryptocurrency mining.

Thermodynamics 101

As anyone who has taken elementary physics knows, energy can’t be created or destroyed. In a closed system, the amount of energy stays the same, merely changing from high-quality forms, such as electricity, into low-quality forms, chiefly heat. Low-quality heat energy can be an environmental pollutant, just like the plastic detritus with which our oceans are now awash.

Modern Life Is Power-Hungry

Ten years ago, the International Energy Agency (IEA) projected that global energy consumption would grow from 15,665 terawatt-hours (TWh) in 2006 to 28,142 TWh by 2030. According to a more recent IEA report, though, we are well ahead of schedule: by the end of 2017 we were already more than 75 percent of the way there.
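To make the "75 percent of the way there" claim concrete, the implied consumption level can be worked out directly from the two figures above (a minimal sketch; the resulting 2017 level is inferred from the article's percentage claim, not a reported measurement):

```python
# IEA figures cited above, in terawatt-hours (TWh).
start_2006 = 15_665
target_2030 = 28_142

# "75 percent of the way there" in absolute terms:
threshold = start_2006 + 0.75 * (target_2030 - start_2006)
print(f"75% of the projected 2006-2030 growth corresponds to about {threshold:,.0f} TWh")
```

In other words, by the end of 2017 global consumption had already passed roughly 25,000 TWh, a level the original projection did not anticipate until well into the 2020s.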

There are good reasons for this acceleration in the rate at which we are burning resources. Modern cities cannot exist without a sprawling IT infrastructure. The services underpinned by these technologies make our lives better, safer, and more comfortable. But every single traffic light needs electricity to work—and nowadays that means not just a traffic light, but the whole supporting and coordinating system of sensors, processors, and servers. Multiply this across dozens of different systems and you have a serious problem.

GPUs and Global Warming

Power-hungry data centers are a major contributor to global warming, increasing the level of CO2 in the atmosphere with every bit they flip. Construction of new data centers shows no signs of slowing: the global data center market is estimated to reach revenues of around $174 billion by 2023, growing at a compound annual growth rate (CAGR) of approximately 4 percent.

Moreover, the type of data centers being built has changed. One of the most significant trends is towards graphics processing unit (GPU) mining farms. GPUs are used heavily by the gaming industry for their ability to render and manipulate images efficiently, but they have also found diverse applications beyond gaming. Large arrays of GPUs are used in specialist server farms for many of the computationally intensive tasks involved in artificial intelligence and neural network applications. Another application is UC Berkeley’s Search for Extra-Terrestrial Intelligence (SETI) program, which uses GPUs to crunch data from radio telescopes.

In addition, GPUs are ideally suited to mining certain cryptocurrencies; one effect of the rise in cryptocurrency prices over the course of 2017 was a global shortage of GPUs. As demand for graphics cards rose with crypto prices, gamers and projects like SETI found themselves short of processing power. Cryptocurrency mining alone produces 33.9 kilotons of CO2 per year.

Suffice it to say that the major graphics card companies are experiencing explosive growth, driven by a slew of high-tech applications. Due to the nature of the technologies used, the efficiency of even the best data centers doesn’t exceed 20 percent, which means the other 80 percent of the energy they draw is converted to heat with no further utility. This brings a whole new world of problems to an already hot topic.
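A quick back-of-the-envelope calculation shows what that 20 percent figure means at data center scale (a sketch; the 5,000 kW facility size is an illustrative assumption, chosen to match the scale of the Grando project discussed later in the article):

```python
def waste_heat_kw(total_draw_kw: float, efficiency: float = 0.20) -> float:
    """Power rejected as heat, given the fraction doing useful work."""
    return total_draw_kw * (1.0 - efficiency)

# Hypothetical 5,000 kW facility at 20 percent efficiency:
# 4,000 kW is dumped into the environment as heat.
print(waste_heat_kw(5_000))
```

Without recovery, that rejected heat is pure loss: the facility pays for the electricity once, and then often pays again to run the cooling that carries the heat away.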

Possible Solutions and Mitigation Strategies

The biggest players in this sphere have recently started to consider how to solve this problem. One set of solutions revolves around where our power is consumed; futurologists have predicted we will one day start building our data centers in space, for example. Even now, Microsoft has placed servers in the depths of the sea off the coast of Scotland, and IBM has built Aquasar, a water-cooled supercomputer, in Zurich. These initiatives don’t solve the problem of dirty heat, but they do make it cheaper for large corporations to service their infrastructure and optimize conditions for their hardware.

Most companies can’t afford to locate their servers on the sea bed, but Germany-based startup Comino has proposed the next-best thing: a liquid cooling system that makes it possible to capture vastly more heat and recycle a higher proportion of waste energy back into the host facility.

More specifically, Comino has applied this approach to GPU supercomputers and servers, meeting an acute need as key technologies from blockchain to AI reach a point of maturity and mass adoption. The company has already built two data centers in Europe with its unique liquid cooling technology. Its pilot project, Comino Grando, was launched in Sweden with a capacity of almost 5,000 kW and at a cost of around $30 million. By recycling energy, Grando has proven to be 40 percent more efficient than air-cooled supercomputers.

Comino’s CEO, Eugeny Vlasov, told me in an interview, “Liquid cooling systems are used to reduce PUE (Power Usage Effectiveness) to 1.05, which is the main measure of the eco-friendliness of such systems. It’s strange, but unfortunately tech companies don’t always think of using the latest and greatest technologies to reduce their environmental impact. We have a strong engineering team that works on integrating advanced tech solutions to deliver cost-efficient management of the extra heat that plagues many data centers.”
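The PUE metric Vlasov mentions is a standard industry measure, defined as total facility energy divided by the energy reaching the IT equipment itself; a value of 1.0 would mean every watt goes to computation, with nothing spent on cooling or power delivery. A minimal sketch of the calculation (the sample wattages are illustrative assumptions, not Comino figures):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility draw / IT draw (>= 1.0)."""
    return total_facility_kw / it_equipment_kw

# Illustrative numbers: 1,050 kW at the meter, 1,000 kW reaching the IT load.
# The result, 1.05, means only 5 percent overhead for cooling, power
# conversion, lighting, and so on.
print(pue(1_050, 1_000))
```

For comparison, conventional air-cooled facilities commonly run well above this, which is why driving PUE down toward 1.0 is treated as the headline efficiency goal.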

Solutions like Comino’s aim to make the entire world more eco-friendly, even as the pace of tech growth shows no signs of slowing. “Current tasks like cryptocurrency mining, neural networks, and GPU databases eat up massive amounts of energy. Crypto mining alone consumes more energy than Denmark, twice what it needs to be, with no heat recovery. We can do much, much better,” Vlasov said.

By the middle of the next decade, data centers are projected to account for almost 10 percent of total global electricity consumption. In an age of carbon credits and green incentives, environmental concerns are typically treated as a bolt-on extra, something companies do because they have to, not because it makes sense for their bottom lines. The good news for businesses is that in this instance, environmental benefits align with financial ones, since lower energy use reduces capital costs and operating expenses. The technology works for machine learning, cutting-edge database work and rendering, smart cities, banks, manufacturing, service providers, video streaming, neural networks, and cryptocurrency mining: there are few sectors that do not stand to be revolutionized by water cooling.

Another strategy for reusing the waste energy from data centers is to use that energy to heat buildings; Amazon already employs this approach to heat its headquarters, for example. Dutch startup Nerdalize has begun trials of a solution for the domestic market: customers pay the company to install servers in their homes and receive free heating in exchange.

Of course, all of these improvements in energy efficiency won’t bring us to the ultimate goal of stopping pollution altogether. One day, data centers, servers, and supercomputers may indeed be placed in space, outside Earth’s atmosphere, and only then will hardware heat pollution truly become a thing of the past. In the meantime, that is no reason to stop caring about our planet.

Image Credit: Eliro / Shutterstock.com