The press release is out, and the usual serial bloviators are rushing to trumpet the news. July 2012 was the hottest ever on record! “Yikes! We’re gonna roast! Global Warming!” they wail on Twitter and blogs. The driver of this is AP’s Seth Borenstein, who never met a hot story he didn’t like. Here’s a quote from that story, “Ouch July in US was hottest ever in history books”:

The average temperature last month was 77.6 degrees. That breaks the old record from July 1936 by 0.2 degree, according to the National Oceanic and Atmospheric Administration. Records go back to 1895. … Three of the nation’s five hottest months on record have been recent Julys: This year, 2011 and 2006. Julys in 1936 and 1934 round out the top five.

Of course the first thing I do when I see these sorts of things is go look at the data. It tells a far more interesting and credible story. Here are some graphs NCDC and Seth won’t ever put in a press release or AP story:

From NCDC’s Climate at a glance page:

Now let’s compare to July 1936:

A few things stand out right away.

1. Due to regional weather pattern variability, one state had below-normal temperatures in July 1936: Texas. Take Texas’s below-normal 1936 temperature out of the mix, and there goes your 0.2°F record-making difference with July 2012.

2. Many states had warmer temperatures in 1936 than in 2012. Here’s a table, all numbers in degrees Fahrenheit:

State        1936   2012
Montana      74.7   71.4
N. Dakota    79.7   73.8
S. Dakota    83.8   78.8
Minnesota    76.2   74.4
Wisconsin    74.8   74.7
Nebraska     83.1   80.0
Iowa         82.7   79.4
Kansas       85.1   84.3
Oklahoma     85.8   85.5
Missouri     84.9   83.7
Illinois     83.1   81.7
Indiana      80.9   80.2
Mississippi  82.0   81.8
California   76.3   75.0
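The comparison in the table can be checked with a short script. This is a minimal sketch using only the numbers from the table above (no other data is assumed); it just confirms that every listed state was warmer in July 1936 than in July 2012, and by how much:

```python
# Per-state July mean temperatures (deg F), taken from the table above.
temps_1936 = {"Montana": 74.7, "N. Dakota": 79.7, "S. Dakota": 83.8,
              "Minnesota": 76.2, "Wisconsin": 74.8, "Nebraska": 83.1,
              "Iowa": 82.7, "Kansas": 85.1, "Oklahoma": 85.8,
              "Missouri": 84.9, "Illinois": 83.1, "Indiana": 80.9,
              "Mississippi": 82.0, "California": 76.3}
temps_2012 = {"Montana": 71.4, "N. Dakota": 73.8, "S. Dakota": 78.8,
              "Minnesota": 74.4, "Wisconsin": 74.7, "Nebraska": 80.0,
              "Iowa": 79.4, "Kansas": 84.3, "Oklahoma": 85.5,
              "Missouri": 83.7, "Illinois": 81.7, "Indiana": 80.2,
              "Mississippi": 81.8, "California": 75.0}

# Difference (1936 minus 2012) for each state, rounded to one decimal.
for state in temps_1936:
    diff = round(temps_1936[state] - temps_2012[state], 1)
    print(f"{state:12s} 1936 warmer by {diff:+.1f} deg F")
```

Note this is a straight per-state comparison, not an area-weighted national average; it only demonstrates that the 1936 value exceeds the 2012 value in all fourteen states listed.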

Now compare that to the same map for 1934, where we also see many states warmer than in 2012.

What’s interesting is that if AGW had overcome natural variability, as many claim it has, we wouldn’t see any statewide temperatures in 2012 lower than in 1936 or 1934.

And with all the adjustments that have been going on, which 1930s are we really talking about? The real one or the adjusted one? NASA GISS uses NCDC-adjusted data, which, according to this graph from Steve Goddard, suggests there’s been a whole lot of adjusting going on.

The graph below shows the almost two-degree upward adjustment trend applied by USHCN in the US between the raw thermometer data and the published monthly data.

The adjustments they are making are greater than the claimed trend, meaning that all of the man-made US warming is occurring inside ORNL and GISS computers.
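The bookkeeping behind that claim is simple and can be sketched in a few lines. The annual values below are hypothetical numbers invented for illustration, not actual USHCN data; the sketch just shows the mechanism: if the adjustment (published minus raw) grows over time by more than the raw series does, then the adjustment, not the thermometers, supplies most of the published trend.

```python
# Hypothetical illustration: raw vs. published annual mean anomalies (deg F).
# These numbers are invented for this sketch, NOT actual USHCN values.
years = [1900, 1930, 1960, 1990, 2010]
raw   = [0.1, 0.4, 0.2, 0.3, 0.5]    # raw thermometer data: small trend
final = [-0.7, -0.3, 0.1, 0.6, 1.1]  # published (adjusted) data: large trend

# The adjustment is simply the published value minus the raw value.
adjustment = [round(f - r, 1) for f, r in zip(final, raw)]
print("adjustment (final - raw):", adjustment)

# Crude end-to-end trend: change from first to last value in each series.
raw_trend = round(raw[-1] - raw[0], 1)                  # warming in the raw data
adj_trend = round(adjustment[-1] - adjustment[0], 1)    # warming added by adjustment
print(f"raw trend {raw_trend:+.1f} deg F, adjustment trend {adj_trend:+.1f} deg F")
```

In this made-up example the raw data warm by 0.4°F while the adjustments add 1.4°F, so the published trend is dominated by the adjustment rather than the measurements, which is the shape of the argument being made about the USHCN graph.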

Speaking of adjustments, I recalled the GISS Y2K debacle in 2007, where McIntyre discovered a mistake in GISTEMP. I’ve recovered the graphs from Hansen’s 1999 press release. This was originally part of “Lights Out Upstairs,” a guest post by Steve McIntyre on my old original blog. Just look at how much warmer 1934 was in 1999 than it is now. Much of this can be attributed to NCDC’s USHCNv2 adjustments.

=============================================================

Steve McIntyre wrote then:

In the NASA press release in 1999, Hansen was very strongly for 1934. He said then:

The U.S. has warmed during the past century, but the warming hardly exceeds year-to-year variability. Indeed, in the U.S. the warmest decade was the 1930s and the warmest year was 1934.

This was illustrated with the following depiction of US temperature history, showing that 1934 was almost 0.6 deg C warmer than 1998.

From a Hansen 1999 News Release: http://www.giss.nasa.gov/research/briefs/hansen_07/fig1x.gif

However, within only two years, this relationship had changed dramatically. In Hansen et al 2001 (referred to in the Lights On letter), 1934 and 1998 were in a virtual dead heat, with 1934 in a slight lead. Hansen et al 2001 said:

The U.S. annual (January-December) mean temperature is slightly warmer in 1934 than in 1998 in the GISS analysis (Plate 6)… the difference between 1934 and 1998 mean temperatures is a few hundredths of a degree.

From Hansen et al 2001 Plate 2. Note the change in relationship between 1934 and 1998.

Between 2001 and 2007, for some reason, as noted above, the ranks changed slightly with 1998 creeping into a slight lead.

The main reason for the changes was the incorporation of an additional layer of USHCN adjustments by Karl et al, overlaying the time-of-observation adjustments already incorporated into Hansen et al 1999. Indeed, the validity and statistical justification of these USHCN adjustments is an important outstanding issue.

============================================================

I’ve prepared a before and after graph using the CONUS values from GISS in 1999 and in 2011 (today).

GISS writes now of the bottom figure:

Annual Mean Temperature Change in the United States

Annual and five-year running mean surface air temperature in the contiguous 48 United States (1.6% of the Earth’s surface) relative to the 1951-1980 mean. [This is an update of Figure 6 in Hansen et al. (1999).]

Also available as PDF, or Postscript. Also available are tabular data.

So clearly, the two graphs are linked, and 1998 and 1934 have swapped positions for “warmest year”: 1934 went down by about 0.3°C while 1998 went up by about 0.4°C, for a total swing of about 0.7°C.

And they wonder why we don’t trust the surface temperature data.

In fairness, most of this is the fault of NCDC’s Karl, Menne, and Peterson, who have applied new adjustments in the form of USHCN2 (for US data) and GHCN3 (to global data). These adjustments are the primary source of this revisionism. As Steve McIntyre often says: “You have to watch the pea under the thimble with these guys”.

So the real question is: which 1934 and 1936 are NCDC and Seth Borenstein comparing against? It looks to me like we might not be comparing real temperatures to real temperatures, but rather adjusted ones to highly adjusted ones.

Finally, remember this statement from the AP July 2012 “hottest ever” story:

The average temperature last month was 77.6 degrees.

I have a way to apply a sanity check to this, but I’ll need some crowd-sourcing help. Stay tuned.

==========================================

UPDATE: Dr. Roy Spencer makes an interesting plot, which I’ve annotated to show a color key and years 1934, 1936, and 2012.

He writes in “July 2012 Hottest Ever in the U.S.? Hmmm….I Doubt It”:

Using NCDC’s own data (USHCN, Version 2), and computing area averages for the last 100 years of Julys over the 48 contiguous states, here’s what I get for the daily High temps, Low temps, and daily Averages (click for large version):

As far as daily HIGH temperatures go, 1936 was the clear winner. But because daily LOW temperatures have risen so much, the daily AVERAGE July temperature in 2012 barely edged out 1936.

…

So, all things considered (including unresolved issues about urban heat island effects and other large corrections made to the USHCN data), I would say July was unusually warm. But the long-term integrity of the USHCN dataset depends upon so many uncertain factors, I would say it’s a stretch to call July 2012 a “record”.
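Spencer’s point turns on how the “average” is constructed: the daily mean in these datasets is the simple midpoint of the high and the low, (Tmax + Tmin)/2, so a month with cooler afternoons can still post a warmer “average” if its overnight lows are elevated. A minimal sketch with invented numbers (not actual USHCN station data) shows the mechanism:

```python
# Hypothetical July temperatures (deg F) -- illustrative only,
# NOT actual USHCN station data.
def daily_mean(tmax, tmin):
    """USHCN-style daily mean: simple midpoint of the daily high and low."""
    return (tmax + tmin) / 2.0

# A 1936-style month: very hot afternoons, cool nights.
hot_days    = {"tmax": 98.0, "tmin": 62.0}
# A 2012-style month: slightly cooler afternoons, much warmer nights.
warm_nights = {"tmax": 96.0, "tmin": 67.0}

print(daily_mean(**hot_days))     # 80.0
print(daily_mean(**warm_nights))  # 81.5
```

The month with the hotter afternoons loses the “average” comparison because the other month’s nights are 5°F warmer, which is exactly how Spencer describes 2012 barely edging out 1936 on daily averages while 1936 remains the clear winner on daily highs.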
