The combustion of oil, gas, and coal has made possible a much higher standard of living for humans through radical innovations in technology and science over the past 150 years. Yet for decades, scientists have provided clear evidence that carbon emissions from burning fossil fuels are imperiling our species and many others.

And now, according to the latest Intergovernmental Panel on Climate Change report, the evidence indicates that the window may be closing on the opportunity to limit the damage.

As a historian who has studied the oil industry’s earliest years and petroleum’s role in world history, I believe that keeping the world habitable for future generations will depend on a swift transition to more sustainable energy sources. Unlike past transitions, the current one is at least partly driven by the recognition that stabilizing the climate requires a new mix of energy sources. It is an opportunity to make our energy in smarter ways and with less waste.

A Fossil-Fueled Society

Energy transitions are no simple matter of flipping a switch after the discovery or adoption of a new technology. For instance, for about 25 years after 1890, America’s roadways were a wild laboratory of various conveyances. From the horse-drawn buggy to the bicycle, from the Stanley Steamer to the Model T, devices serving the same purpose (including the first electric cars) derived energy from different sources, including coal, horsepower, and gasoline.

Competition and influence determined that the internal-combustion engine would power autos of the future. However, public will and political decisions also played important roles, as did zoning ordinances and other laws. Americans determined that the 20th century would be powered by fossil fuels such as petroleum. The marketplace provided them with flexibility to create a landscape of drive-thrus and filling stations.

Similarly, consider the changes to how people illuminated their homes, businesses, and public places.

Between 1850 and 1900, Americans mostly did that with candles and oils rendered from the fat of farm animals and whales, and by burning kerosene made first from coal and then from petroleum. By the early 1900s, most American lighting was powered by electricity, initially generated from burning coal. Later in the century, that power came from a mixture of coal, natural gas, hydropower, and nuclear energy. Starting around 2000, the use of wind and solar energy began to climb.

The same kinds of transformations occurred with heating and manufacturing. Cheap electricity, gasoline, and diesel together produced the massive amounts of power and flexibility that completely changed the human condition in the 20th century.

Rethinking Energy

Fossil fuels and nuclear reactors made it possible to do more work and accomplish more than ever before in human history. Now, another energy transition beckons.

Wind, solar, and other forms of renewable energy, paired with increased efficiency and vast amounts of storage, do not necessarily promise more power. But relying on them does point toward a more sustainable future.

I believe that this revolution requires new ways of thinking about energy that date back to the global energy crisis of the 1970s, a time of temporary oil shortages caused by Middle Eastern nations’ political discontent with Western nations. Long lines at gas stations and other inconveniences fueled a national conversation about conservation.

In April 1977, President Jimmy Carter made a memorable call for “the moral equivalent of war” on energy waste, predicting that decisions about energy would “test the character of the American people” and declaring that “we must start now to develop the new, unconventional sources of energy we will rely on in the next century.”

It had become clear all around that fossil fuels were not infinite resources. Energy supplies, a matter of national security since World War I, became a preeminent geopolitical consideration in relations between nations.

After 1980, growing awareness of the hazards posed by climate change introduced new criteria by which to select new power sources and phase out old ones. Pollution, in the form of smog and spills, had been an obvious side effect of burning fossil fuels from the start. But despite some early hypotheses, most scientists did not initially realize that this pollution was interfering with Earth’s basic functions.

As a result, in addition to being judged on price, supply, and output, energy sources now must be judged for the carbon that they put into the atmosphere. Under such scrutiny, and thanks to innovation and market forces, fossil fuels are no longer cheaper than solar, wind, and geothermal alternatives in a growing number of locations.

Energy accounting is beginning to change, particularly in parts of the nation and the world where carbon emissions are capped and traded, and in countries with carbon taxes in effect.

No Choice in the Long Run

Human energy use has transitioned more or less constantly since we developed the ability to control fire. Historians have long observed that when nations resist these transitions, they can fall behind for an entire generation or more.

For instance, Chinese sailors opened the Age of Sail in the early 1400s, when big ships first harnessed the power of the wind to widen the scope of exploration, trade, and warfare. But then China essentially sat idly by, watching while other nations wove this innovation into a new global economy.

Similarly, huge technological leaps are now proceeding from the assumption that climate change—with its increased temperatures, erratic weather patterns, melting ice caps, rising seas, and the heightened intensity and frequency of storms—demands new ways of thinking about energy.

And any nation that fails to accept this new reality may find itself quickly outmoded.

Brian C. Black, Distinguished Professor of History and Environmental Studies, Pennsylvania State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Image Credit: lassedesignen / Shutterstock.com