John Christy: Climate Data Maven

Earlier this week, the Senate Environment & Public Works Committee held a hearing entitled “Update on the Latest Climate Change Science and Adaptation Measures.” Testimony by Dr. John Christy of the University of Alabama Huntsville is too valuable not to share with the millions (okay, hundreds) of folks who visit this site.

Christy is a data maven. He spends “tedious” weeks and months examining surface observations as well as weather balloon and satellite measurements to build “datasets from scratch to advance our understanding of what the climate is doing and why.” He uses the datasets “to test hypotheses of climate variability and change.” Yes, it’s called the scientific method, but much of what passes for climate science today is, in Christy’s words, “opinion, arguments from authority, dramatic press releases, and fuzzy notions of consensus generated by a preselected group.”

Increasingly, we hear experts blame global warming for bad weather. Most acknowledge that no single weather event can be attributed to global climate change. However, they contend, the pattern of recent events — the sheer number and severity of heat waves, wildfires, droughts, freak storms — is exactly what climate scientists have predicted and must be due to mankind’s fuelish ways. Such assertions, Christy shows, are not based on real data.

One way to measure trends in extreme weather is to compare the number of state record high and low temperatures by decade. Many more state high temperature records were set in the 1930s than in recent decades. Even more surprising, “since 1960, there have been more all-time cold records set than hot records in each decade.”

There is no discernible greenhouse “fingerprint” in these data.
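
For readers who like to see the arithmetic, here is a minimal Python sketch of that decade-by-decade tally. The inputs (the year in which each state's all-time high or low was set) and the function itself are illustrative assumptions, not Christy's actual code.

```python
from collections import Counter

# Hypothetical inputs: for each of the 50 states, the year in which its
# all-time record high (or all-time record low) was set. This is only an
# illustration of the decade-by-decade tally described above.
def records_by_decade(record_years):
    """Count how many all-time state records fall in each decade."""
    return Counter(10 * (year // 10) for year in record_years)

# Comparing records_by_decade(high_record_years) against
# records_by_decade(low_record_years) gives the hot-versus-cold
# comparison by decade.
```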

One might object that state temperature records are not informative, because the number of data points — 50 — is so small. So Christy also investigated “the year-by-year numbers of daily all-time record high temperatures from a set of 970 weather stations with at least 80 years of record.” He explains: “There are 365 opportunities in each year (366 in leap years) for each of the 970 stations to set a record high (TMax).” Summing the record-setting TMax days by year, Christy found several years with more than 6,000 record highs before 1940 but no year above 5,000 after 1954. “The clear evidence is that extreme high temperatures are not increasing in frequency, but actually appear to be decreasing.”
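
To make the counting exercise concrete, here is a rough Python (pandas) sketch of how such a tally could be done. The data layout, column names, and the rule that ties go to the earliest year are my assumptions; the testimony does not spell out those details.

```python
import pandas as pd

def count_all_time_daily_record_highs(df: pd.DataFrame) -> pd.Series:
    """Per-year counts of all-time daily record highs.

    df: one row per station per day, with columns 'station',
    'date' (datetime64) and 'tmax' (the daily high temperature).
    """
    df = df.sort_values("date").copy()
    df["monthday"] = df["date"].dt.strftime("%m-%d")
    df["year"] = df["date"].dt.year

    # For each station and calendar day (365/366 chances per year), pick the
    # row holding the all-time highest TMax; ties go to the earliest year.
    record_rows = df.loc[df.groupby(["station", "monthday"])["tmax"].idxmax()]

    # Count how many of those all-time records fall in each year.
    return record_rows["year"].value_counts().sort_index()
```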

Since climate change is a long-term phenomenon, Christy also calculates the number of record highs in 10-year moving averages. The figure below shows the trend line based on 704 stations that have at least 100 years of data.

Christy comments: “Note that the value for the most recent decade is less than half of what was observed in the 1930s.”
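
The smoothing itself is simple. The sketch below shows one plausible form of a 10-year moving average of the yearly counts (a trailing window over consecutive years); whether Christy uses a trailing or centered window is not stated in the testimony.

```python
import pandas as pd

def ten_year_moving_average(yearly_counts: pd.Series) -> pd.Series:
    """Smooth yearly record counts with a 10-year moving average.

    yearly_counts: Series indexed by year, e.g. the output of the
    counting sketch above.
    """
    years = range(yearly_counts.index.min(), yearly_counts.index.max() + 1)
    # Fill years with no records as zero so the window length stays fixed.
    filled = yearly_counts.reindex(years, fill_value=0).sort_index()
    return filled.rolling(window=10, min_periods=10).mean()
```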

What about the heat wave of 2012 — isn’t it worse than any other year in the instrumental record? No. The graph below shows the number of record high temperatures for stations in “7 Central-US states where the heat is worst (AR-IL-IN-IA-MO-NE) and stations on the West Coast (CA-OR-WA).” For both the Central US and the West Coast, the largest numbers of TMax days occurred during the heat waves of 1911 and the 1930s. Although the Central US saw a large number of TMax days in 2012, the West Coast saw very few, indicating that the current heat wave “is smaller than previous events.”

What about the current drought — is it part of a long-term trend that correlates with the ongoing rise in atmospheric greenhouse gas concentrations? The graph below shows the month-by-month percentage of the area of the U.S. classified as moderate to extreme for dryness and wetness by the National Oceanic and Atmospheric Administration (NOAA). Since 1900, there has been a high degree of year-to-year variability but “no long-term trend.”

Over a longer time frame, we find even greater climate variability. The photo below shows that trees grew on dry ground about 900 years ago “in what is now a Sierra Nevada alpine lake.” Christy comments: “This indicates that a drastic but natural change to a much drier climate must have lasted for at least a century for trees to have grown to these sizes on dry ground.”

A 500-year reconstruction of moisture in the upper Colorado River basin indicates that the 20th century was quite moist compared to the four prior centuries, all of which experienced multi-decadal droughts.

Source: Piechota et al. 2004

Christy emphasizes that he is not using these data to prove that U.S. weather is becoming less extreme or colder. Rather, his point is that “extreme events are poor metrics to use for detecting climate change.”

Christy’s testimony addresses other critical issues in climate science and policy. Here, I’ll briefly summarize just two points.

(1) Popular surface datasets are not reliable indicators of the greenhouse effect. Land use changes (urbanization, farming, deforestation) “disrupt the normal formation of the shallow, surface layer of cooler air during the night when TMin [daily low temperature] is measured.” Over time, TMin gets warmer, producing a trend easily mistaken for a global atmospheric phenomenon. That is one reason Christy has devoted much of his career to developing a satellite record of global temperatures. Satellite datasets “are not affected by these surface problems and more directly represent the heat content of the atmosphere.”

(2) Satellite data indicate that IPCC climate models are too sensitive and project too much warming. The graph below shows the results from 34 of the climate model simulations of global temperatures that will be included in the IPCC’s forthcoming Fifth Assessment Report. The thick black line shows the average model projection. The circles show the observed results from the two main satellite-based monitoring systems.