This story first appeared on the Atlantic website and is reproduced here as part of the Climate Desk collaboration.

Let’s take a quick tour of how Americans use energy at home. Per capita energy consumption has stayed fairly stable over the past thirty years, but how we use energy has changed.

Insulation improvements and efficiency gains in heating and cooling have made temperature management less energy-intensive. But those gains have been largely offset by the proliferation of electronic appliances and gadgets.

While appliances and electronics have grown in their share of total energy consumption, the single biggest energy drain remains heating, as well as cooling in warmer climes.

Since temperature regulation is very energy-intensive, regional trends explain much of the change in the residential energy picture—as America’s population shifts towards the South and West, heating becomes less important, but cooling more so. The coasts also consume much less energy per capita than inland America.

Heating tends to use more energy than cooling, and residential heating usually runs on natural gas or electricity. Fuel oil is still used as a heat source in the Northeast, but in roughly 85 percent of American households the heating system is either electric or gas-powered.

The fuel of choice has changed considerably, driven by America’s demographic shifts: electricity is most common in the South, and the pivot toward the South and West meant electricity’s role in heating homes increased substantially during the last quarter of the 20th century. As more people move South, electricity’s share may continue to grow.

This trend may not be good news for household bills. Electric heating is hugely energy-intensive, and a study conducted by an Austin-based research group found that gas may be cheaper than electricity for appliances that consume lots of energy, such as dryers and ovens.

Appliances vary considerably in the amount of electricity they require, but one way to measure their consumption is through “wattage”: the maximum power an appliance draws.

Electric water heaters, dishwashers, and clothes dryers are some of the highest wattage items—meaning that they draw a lot of electricity when running—and the percentage of households with such appliances has hugely increased over the past thirty years.

However, wattage is a misleading guide to total consumption: a high-wattage item run infrequently can use less electricity overall than a lower-wattage appliance that is always on. Measured by overall consumption, refrigerators emerge as one of the biggest energy guzzlers, despite massive efficiency improvements since the 1980s. The California Energy Commission reports that today’s refrigerators “use 60 percent less electricity on average than 20-year-old models.”
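The distinction between peak power and total energy comes down to simple arithmetic: energy in kilowatt-hours is power (in kilowatts) multiplied by hours of use. The sketch below illustrates the point; the wattage and usage figures are rough assumptions for illustration, not measurements from the article.

```python
# Illustrative comparison of "wattage" (peak power) versus total energy use.
# The wattage and hours-of-use figures below are rough assumptions.

def annual_kwh(watts, hours_per_year):
    """Total annual energy in kilowatt-hours: power (kW) times hours of use."""
    return watts / 1000 * hours_per_year

# A clothes dryer draws a lot of power but runs only a few hours a week.
dryer_kwh = annual_kwh(watts=3000, hours_per_year=3 * 52)    # 468 kWh/year

# A refrigerator draws far less power, but its compressor cycles all year round.
fridge_kwh = annual_kwh(watts=150, hours_per_year=8 * 365)   # 438 kWh/year

print(f"Dryer:  {dryer_kwh:.0f} kWh/year")
print(f"Fridge: {fridge_kwh:.0f} kWh/year")
```

Under these assumed numbers, an appliance with a twentieth of the dryer's wattage ends up using nearly as much electricity over a year, which is why refrigerators rank so high in overall consumption.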

Taken overall, though, the growing number of appliances and gadgets has more than offset the efficiency improvements in both individual appliances and space heating. This phenomenon is known as the “rebound effect,” revisited by David Owen in his piece in the New Yorker, and is based on an argument put forward in the mid-19th century by the economist William Stanley Jevons, known as the “Jevons paradox.”

The theory suggests that efficiency savings can never reduce energy consumption, because the money a household saves on energy bills will be used to buy other energy-intensive products.

If correct, this would suggest that policies to raise the cost of energy are just as important as improving efficiency. However, Owen’s argument has been criticized by energy wonks who counter that the rebound effect, while real, comes nowhere close to fully canceling out efficiency gains.

Either way, it seems that it’s one thing to reach a plateau of energy consumption, but quite another to begin reducing it altogether.