How do we know that climate change plays a role in today's increasingly powerful wildfires? Although fire has always been a natural, and often beneficial, part of many ecosystems, climate change and other human-caused factors are fundamentally changing the frequency and intensity of wildfires in many places in the US and around the world.

Western US wildfires on the rise

Wildfires in the western United States are getting worse: they are burning larger areas, and burning more often, than in decades past.

While fire is a natural and essential part of these ecosystems, warming temperatures and drying soils, both tied to human-caused climate change, have contributed to observed increases in wildfire activity. Earlier snowmelt, higher temperatures, drier soils from increased evaporation, and greater water loss from vegetation have all helped lengthen the western fire season. Leaders at CalFire even suggest there is no longer a wildfire “season” at all, as California in recent years has been battling blazes year-round.

Factors unrelated to climate change also affect wildfire risk. Past fire suppression and forest management practices have led to a buildup of flammable fuels, which increases the likelihood of severe fires. The risk to people and property is rising as well, because more homes and businesses are being built in and near fire-prone areas known as the “wildland-urban interface.”

Increased tree mortality from bark beetle infestations, which themselves have underlying climate drivers, has also modified landscapes in ways that make them more likely to burn. And multi-year cycles of wet periods followed by drought promote the growth of grasses and other low vegetation that, once dried out, serve as kindling for larger fires.