Nowadays a wealth of facts can be found on the Internet. Take forest fires in the USA as an example. The acreage ravaged by forest fires is provided in a table by the National Interagency Fire Center. Strangely, the data are not offered in graphical form. You are forced to make your own chart, which is no problem, but most people are simply left in the dark. Steven Goddard (Tony Heller) shows such charts at his Real Science blog.

And when the facts do contradict their alarmist claims, they get personal: they attack the skeptic's profession, education, skin color, and so on.

Thus they prefer to make a claim and hope that nobody will bother to fact-check it. They don't like climate skeptics because skeptics have the inconvenient habit of carefully examining the facts. They prefer a silent, unquestioning audience that says yes and amen to every alarmist claim.

Droughts increase the risk of forest fires; that is logical. However, it is false to reflexively attribute every forest fire to climate change. There have always been droughts and forest fires. Anyone wishing to shift the blame to climate change first has to show that the trend has already deviated from the range of natural variability. For many, that is simply too much work.

The 2004–2014 burn acreage trend is falling. Chart source: Tony Heller.

One cannot simply pull climate change out of the magic hat every time a forest fire appears. The University of Colorado at Boulder recently calculated that 84% of all forest and bush fires in the USA are caused by humans. Read the press release from February 2017:

Humans have dramatically increased extent, duration of wildfire season

Humans have dramatically increased the spatial and seasonal extent of wildfires across the U.S. in recent decades and ignited more than 840,000 blazes in the spring, fall and winter seasons over a 21-year period, according to new University of Colorado Boulder-led research. After analyzing two decades’ worth of U.S. government agency wildfire records spanning 1992-2012, the researchers found that human-ignited wildfires accounted for 84 percent of all wildfires, tripling the length of the average fire season and accounting for nearly half of the total acreage burned. The findings were published today in the journal Proceedings of the National Academy of Sciences.

“There cannot be a fire without a spark,” said Jennifer Balch, Director of CU Boulder’s Earth Lab and an assistant professor in the Department of Geography and lead author of the new study. “Our results highlight the importance of considering where the ignitions that start wildfires come from, instead of focusing only on the fuel that carries fire or the weather that helps it spread. Thanks to people, the wildfire season is almost year-round.” The U.S. has experienced some of its largest wildfires on record over the past decade, especially in the western half of the country. The duration and intensity of future wildfire seasons is a point of national concern given the potentially severe impact on agriculture, ecosystems, recreation and other economic sectors, as well as the high cost of extinguishing blazes. The annual cost of fighting wildfires in the U.S. has exceeded $2 billion in recent years.

The CU Boulder researchers used the U.S. Forest Service Fire Program Analysis-Fire Occurrence Database to study records of all wildfires that required a response from a state or federal agency between 1992 and 2012, omitting intentionally set prescribed burns and managed agricultural fires. Human-ignited wildfires accounted for 84 percent of 1.5 million total wildfires studied, with lightning-ignited fires accounting for the rest. In Colorado, 30 percent of wildfires from 1992-2012 were started by people, burning over 1.2 million acres. The fire season length for human-started fires was 50 days longer than the lightning-started fire season (93 days compared to 43 days), a twofold increase. “These findings do not discount the ongoing role of climate change, but instead suggest we should be most concerned about where it overlaps with human impact,” said Balch. “Climate change is making our fields, forests and grasslands drier and hotter for longer periods, creating a greater window of opportunity for human-related ignitions to start wildfires.”

While lightning-driven fires tend to be heavily concentrated in the summer months, human-ignited fires were found to be more evenly distributed across all seasons. Overall, humans added an average of 40,000 wildfires during the spring, fall and winter seasons annually—over 35 times the number of lightning-started fires in those seasons. “We saw significant increases in the numbers of large, human-started fires over time, especially in the spring,” said Bethany Bradley, an associate professor at University of Massachusetts Amherst and co-lead author of the research. “I think that’s interesting, and scary, because it suggests that as spring seasons get warmer and earlier due to climate change, human ignitions are putting us at increasing risk of some of the largest, most damaging wildfires.” “Not all fire is bad, but humans are intentionally and unintentionally adding ignitions to the landscape in areas and seasons when natural ignitions are sparse,” said John Abatzoglou, an associate professor of geography at the University of Idaho and a co-author of the paper. “We can’t easily control how dry fuels get, or lightning, but we do have some control over human started ignitions.”

The most common day for human-started fire by far, however, was July 4, with 7,762 total wildfires started on that day over the course of the 21-year period. The new findings have wide-ranging implications for fire management policy and suggest that human behavior can have dramatic impact on wildfire totals, for good or for ill. “The hopeful news here is that we could, in theory, reduce human-started wildfires in the medium term,” said Balch. “But at the same time, we also need to focus on living more sustainably with fire by shifting the human contribution to ignitions to more controlled, well-managed burns.” Co-authors of the new research include Emily Fusco of the University of Massachusetts Amherst and Adam Mahood and Chelsea Nagy of CU Boulder. The research was funded by the NASA Terrestrial Ecology Program, the Joint Fire Sciences Program and Earth Lab through CU Boulder’s Grand Challenge Initiative.

In July 2017 the Institute for Basic Science explained that the risk of forest fires in the US Southwest depends strongly on the temperature difference between the Pacific and Atlantic Oceans. Ultimately, the ocean cycles are the real drivers. Press release (via Science Daily):

An international team of climate researchers from the US, South Korea and the UK has developed a new wildfire and drought prediction model for southwestern North America. Extending far beyond the current seasonal forecast, this study published in the journal Scientific Reports could benefit the economies with a variety of applications in agriculture, water management and forestry.

Over the past 15 years, California and neighboring regions have experienced heightened drought conditions and an increase in wildfire numbers with considerable impacts on human livelihoods, agriculture, and terrestrial ecosystems. This new research shows that in addition to a discernible contribution from natural forcings and human-induced global warming, the large-scale difference between Atlantic and Pacific ocean temperatures plays a fundamental role in causing droughts, and enhancing wildfire risks.

‘Our results document that a combination of processes is at work. Through an ensemble modeling approach, we were able to show that without anthropogenic effects, the droughts in the southwestern United States would have been less severe,’ says co-author Axel Timmermann, Director of the newly founded IBS Center for Climate Physics, within the Institute for Basic Science (IBS), and Distinguished Professor at Pusan National University in South Korea. ‘By prescribing the effects of human-made climate change and observed global ocean temperatures, our model can reproduce the observed shifts in weather patterns and wildfire occurrences.’

The new findings show that a warm Atlantic and a relatively cold Pacific enhance the risk for drought and wildfire in the southwestern US. ‘According to our study, the Atlantic/Pacific temperature difference shows pronounced variations on timescales of more than 5 years. Like swings of a very slow pendulum, this implies that there is predictability in the large-scale atmosphere/ocean system, which we expect will have a substantial societal benefit,’ explains Yoshimitsu Chikamoto, lead author of the study and Assistant Professor at the University of Utah in Logan.

The new drought and wildfire predictability system developed by the authors expands beyond the typical timescale of seasonal climate forecast models, used for instance in El Niño predictions. It was tested with a 10-23 month forecasting time for wildfire and 10-45 for drought. ‘Of course, we cannot predict individual rainstorms in California and their local impacts months or seasons ahead, but we can use our climate computer model to determine whether on average the next year will have drier or wetter soils or more or less wildfires. Our yearly forecasts are far better than chance,’ states Lowell Stott, co-author of the study from the University of Southern California in Los Angeles.

Bringing together observed and simulated measurements on ocean temperatures, atmospheric pressure, soil water and wildfire occurrences, the researchers have a powerful tool in their hands, which they are willing to test in other regions of the world: ‘Using the same climate model configuration, we will also study the soil water and fire risk predictability in other parts of our world, such as the Mediterranean, Australia or parts of Asia,’ concludes Timmermann. ‘Our team is looking forward to developing new applications with stakeholder groups that can benefit from better soil water forecasts or assessments in future fire risk.’

Ocean cycles (El Niño, La Niña) were also identified as forest fire drivers in the USA by Mason et al. 2017:

Effects of climate oscillations on wildland fire potential in the continental United States

The effects of climate oscillations on spatial and temporal variations in wildland fire potential in the continental U.S. are examined from 1979 to 2015 using cyclostationary empirical orthogonal functions (CSEOFs). The CSEOF analysis isolates effects associated with the modulated annual cycle and the El Niño–Southern Oscillation (ENSO). The results show that, in early summer, wildland fire potential is reduced in the southwest during El Niño but is increased in the northwest, with opposite trends for La Niña. In late summer, El Niño is associated with increased wildland fire potential in the southwest. Relative to the mean, the largest impacts of ENSO are observed in the northwest and southeast. Climate impacts on fire potential due to ENSO are found to be most closely associated with variations in relative humidity. The connections established here between fire potential and climate oscillations could result in improved wildland fire risk assessment and resource allocation.

El Niño also plays a large role in driving forest fires in the US Northwest, according to Barbero et al. 2015:

Seasonal reversal of the influence of El Niño–Southern Oscillation on very large wildfire occurrence in the interior northwestern United States

Satellite-mapped fire perimeters and the multivariate El Niño–Southern Oscillation index were used to examine the impact of concurrent El Niño–Southern Oscillation (ENSO) phase on very large fire (VLF) occurrences over the intermountain northwestern United States (U.S.) from 1984 to 2012. While the warm phase of ENSO promotes drier and warmer than normal conditions across the region during winter and spring that favor widespread fire activity the following summer, a reduction in VLFs was found during the warm phase of ENSO during summer concurrent with the fire season. This paradox is primarily tied to an anomalous upper level trough over the western U.S. and positive anomalies in integrated water vapor that extend over the northwestern U.S. during summers when the warm phase of ENSO is present. Collectively, these features result in widespread increases in precipitation amount during the summer and a curtailment of periods of critically low-fuel moistures that can carry wildfire.

Overall, forest fires in the USA have decreased significantly compared to the previous century (see the article by Larry Kummer at Fabius Maximus).

No forest fire trend could be found in Colorado over the past centuries; see the press release issued by the University of Colorado in 2014:

Colorado’s Front Range fire severity today not much different than in past, says CU-Boulder study

The perception that Colorado’s Front Range wildfires are becoming increasingly severe does not hold much water scientifically, according to a massive new study led by the University of Colorado Boulder and Humboldt State University in Arcata, Calif.

The study authors, who looked at 1.3 million acres of ponderosa pine and mixed conifer forest from Teller County west of Colorado Springs through Larimer County west and north of Fort Collins, reconstructed the timing and severity of past fires using fire-scarred trees and tree-ring data going back to the 1600s. Only 16 percent of the study area showed a shift from historically low-severity fires to severe, potential crown fires that can jump from treetop to treetop.

The idea that modern fires are larger and more severe as a result of fire suppression that allowed forest fuels to build up in the past century is still prevalent among some, said CU-Boulder geography Professor Thomas Veblen, a study co-author. ‘The key point here is that modern fires in these Front Range forests are not radically different from the fire severity of the region prior to any effects of fire suppression,’ he said. A paper on the subject was published Sept. 24 in the journal PLOS ONE.

The study was led by Associate Professor Rosemary Sherriff of Humboldt State University and involved Research Scientist Tania Schoennagel of CU-Boulder’s Institute of Arctic and Alpine Research, CU-Boulder doctoral student Meredith Gartner and Associate Professor Rutherford Platt of Gettysburg College in Gettysburg, Pa. The study was funded by the National Science Foundation.

‘The common assumption is that fires are now more severe and are killing higher percentages of trees,’ said Sherriff, who completed her doctorate at CU-Boulder under Veblen in 2004. ‘Our results show that this is not the case on the Front Range except for the lowest elevation forests and woodlands.’

One important new finding comes from a comparison of nine large fires that have occurred on the Front Range since 2000 (including the 2002 Hayman Fire southwest of Denver, the 2010 Fourmile Canyon Fire west of Boulder and the 2012 High Park Fire west of Fort Collins) with historic fire effects in the region. ‘It’s true that the Colorado Front Range has experienced a number of large fires recently,’ said Schoennagel. ‘While more area has burned recently compared to prior decades – with more homes coming into the line of fire – the severity of recent fires is not unprecedented when we look at fire records going back before the 1900s.’

In addition, tree-ring evidence from the new study shows there were several years on the Front Range since the 1650s when there were very large, severe fires. The authors looked at more than 1,200 fire-scarred tree samples and nearly 8,000 samples of tree ages at 232 forest sample sites from Teller County to Larimer County. The study is one of the largest of its kind ever undertaken in the western United States. The team was especially interested in fire records before about 1920, when effective fire suppression in the West began in earnest.

‘In relatively dry ponderosa pine forests of the West, a common assumption is that fires were relatively frequent and of low severity, and not lethal to most large trees, prior to fuel build-up in the 20th century,’ said Veblen. ‘But our study results showed that about 70 percent of the forest study area experienced a combination of moderate and high-severity fires in which large percentages of the mature trees were killed.’

Along the Front Range, especially at higher elevations, homeowners and fire managers should expect a number of high-severity fires unrelated to any kind of fire suppression and fuel build-up, said Schoennagel. ‘This matters because high-severity fires are dangerous to people, kill more trees and are trickier and more expensive to suppress.’

‘Severe fires are not new to most forests in this region,’ said Sherriff. ‘What is new is the expanded wildland-urban interface hazard to people and property and the high cost of suppressing fires for society.’

In addition, a warming Colorado climate (2 degrees Fahrenheit since 1977) has become a wild card regarding future Front Range fires, according to the team. While fires are dependent on ignition sources and can be dramatically influenced by high winds, the team expects to see a substantial increase in Front Range fire activity in the low and mid-elevations in the coming years as temperatures continue to warm, a result of rising greenhouse gases in Earth’s atmosphere.

2016 was a bad year for forest fires in California. Al Gore immediately pointed the finger at climate change. But it was later discovered that a series of arsons was behind most of the fires, and the house of cards of climate alarm quickly collapsed. The University of Arizona also found that the fires were promoted by poor land use practices. Press release: