Driven by winds up to 45 miles per hour, the Galena Fire west of Fort Collins, Colorado, spread to about 1,000 acres on Friday night.

Fire season out here in the West is supposed to begin in May. But that didn't stop a nasty wildfire from blowing up yesterday about 45 minutes north of where I live. With continuing drought, low snowpack, record high temperatures in the mid-70s, and winds gusting as high as 45 miles per hour, all it took was an ignition to get a wildfire off and running. By nightfall the fire had spread to about 1,000 acres, and as I write this it is only 5 percent contained. Authorities have not yet provided specifics, but they have reported that the fire was human-caused.

The Galena Fire is probably a harbinger of another bad fire season, following close on the heels of 2012, which was horrific. Last year, wildfires burned over 9.2 million acres across the United States, the third-highest total since 2000.

With all of this in mind, I've put together a series of graphics intended to provide some context for what's happening. Large parts of the West are experiencing significant drought right now. Here in Colorado, 100 percent of the state is in drought:

(Source: U.S. Drought Monitor)

And in all of Larimer County, where the Galena Fire is located, drought conditions are categorized as severe. Given the dearth of precipitation, it should come as no surprise that snowpack is alarmingly low in the mountains west of Fort Collins, and in Colorado generally:

Statewide, Colorado snowpack stands at just 74 percent of average for this time of year. (Source: Natural Resources Conservation Service)

There's a broader context to what's happening as well. According to a landmark paper in Science by Anthony Westerling and colleagues:

. . . large wildfire activity increased suddenly and markedly in the mid-1980s, with higher large-wildfire frequency, longer wildfire durations, and longer wildfire seasons. The greatest increases occurred in mid-elevation, Northern Rockies forests, where land-use histories have relatively little effect on fire risks and are strongly associated with increased spring and summer temperatures and an earlier spring snowmelt.

Here is a key graph from that research:

The red bars show the annual frequency of large western U.S. forest wildfires, meaning fires larger than 400 hectares. The black line depicts the mean March-through-August temperature for the western United States. (Source: Westerling et al., http://www.sciencemag.org/content/313/5789/940.abstract)

Recent work also suggests that what we're observing in the West is a form of payback: the enhanced fire activity of recent years may be repaying a "fire deficit" that accumulated during the 20th century, thanks to fire suppression and other human activities. That research, by Jennifer Marlon and colleagues, was published last year in the Proceedings of the National Academy of Sciences. Here's a key graphic from the study:

Using charcoal records from lakes, researchers have reconstructed the rate of biomass burning in the western United States over the past 3,000 years. Burn rates in the 20th century were relatively low, as low as they were during the Little Ice Age, about 400 years ago. (Source: PNAS, vol. 109, no. 9, E535–E543)

For all the details, see this story I wrote for The Daily Climate last June. Suffice it to say that fire activity in the West began dropping as the 19th century ended, and the trend continued into the 20th century. (You can see that decline at the far right of the graph above.) The drop occurred even as a strong signal of global warming from human activities emerged. Patrick Bartlein, a University of Oregon climatologist and co-author of the study, told me that "this divergence between climate and fire activity is unsustainable. Eventually, nature will catch up."