Suppose I wanted to convince people that temperature in the USA wasn’t going up, it was going down. What would I show? Let’s try yearly average temperature in the conterminous U.S., also known as the “lower 48 states” (I’ll just call it “USA”):



Well that won’t do. It shows that temperature has been rising, not falling. By the way, I’ve included two trend estimates. The blue straight line is a linear trend estimate, and it’s going up. The red curvy line is a nonlinear trend estimate; it has gone up and down and up, and is now rising fast. Scary fast. That definitely won’t do.

But wait! The temperature shown is the mean temperature, which is the average of the high and low temperatures. What if I tried just low temperatures?
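For the record, “mean temperature” in records like these is just the midpoint of the daily high and low. Here’s a minimal sketch with made-up numbers:

```python
import numpy as np

# Hypothetical daily highs and lows (degrees F) for one week
tmax = np.array([71.0, 73.5, 68.2, 75.0, 77.1, 74.4, 70.3])
tmin = np.array([52.0, 54.1, 50.5, 55.2, 57.0, 56.3, 51.8])

# The reported "mean" is simply (high + low) / 2 for each day
tmean = (tmax + tmin) / 2.0
```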

That won’t do either. Scary fast.

How about high temperatures?

That still won’t do, but it’s a little better. There’s a more pronounced hump in the 1930s — that’s the Dust Bowl era. Could I maybe make the most of that?

Let’s try this: look at high temperature during the different seasons of the year. After all, we know winter has been warming faster than summer, so maybe summertime alone — or at least one of the seasons — will give a more useful “sucker people” picture. Here are the average high temperatures for all four seasons separately:
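In case you’re wondering how the seasonal split works: the usual convention is meteorological seasons (December through February is winter, and so on). A quick sketch with invented monthly figures:

```python
import numpy as np

# Hypothetical monthly-average high temperatures for one year, Jan..Dec (deg C)
monthly_high = np.array([3.1, 5.0, 10.2, 16.4, 22.0, 27.5,
                         30.1, 29.3, 24.8, 17.6, 9.9, 4.2])

# Meteorological seasons, as 0-based month indices
seasons = {
    "winter": [11, 0, 1],   # Dec, Jan, Feb
    "spring": [2, 3, 4],    # Mar, Apr, May
    "summer": [5, 6, 7],    # Jun, Jul, Aug
    "autumn": [8, 9, 10],   # Sep, Oct, Nov
}

# Average the three months of each season separately
seasonal_high = {name: monthly_high[idx].mean() for name, idx in seasons.items()}
```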

Now we’re getting somewhere! Summer high temperature has still been increasing overall, but that hump during the 1930s (the Dust Bowl era) is far more pronounced. Maybe I could make something of that?

Perhaps I could just get rid of some of the data I don’t like. I can’t get rid of the most recent stuff — then people will figure out I’m trying to sucker them. How about I get rid of some of the early stuff? I’ll start with 1918, instead of starting when the data actually start (1895). That leaves this:
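You can see how much the start year matters with a toy example. The numbers below are invented — just a modest steady warming plus a 1930s-style hump — but they show how dropping the early years shrinks the fitted linear trend:

```python
import numpy as np

years = np.arange(1895, 2021)
# Invented series: 0.01 deg/yr warming plus a hump centered on 1934
temps = 0.01 * (years - 1895) + 1.5 * np.exp(-((years - 1934) / 8.0) ** 2)

# Linear trend using all the data (starting in 1895)
slope_full = np.polyfit(years, temps, 1)[0]

# Same data, but starting in 1918: the hump now sits near the start of
# the record, propping up the early values and flattening the fit
keep = years >= 1918
slope_trunc = np.polyfit(years[keep], temps[keep], 1)[0]
```

The truncated trend is still (barely) positive, just much smaller — exactly the “nothing to worry about” look.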

Finally! I’ve got a graph that looks like there’s nothing to worry about, where the linear trend is so small you almost can’t tell it’s still (barely) rising, and I only had to pick one of 12 possible combinations (mean/high/low temperature over winter/spring/summer/autumn) and leave out the early data to get it. Clever.

Even so, the trend is still going up, even if just barely. And that’s the linear trend; the nonlinear trend looks like it might be rising noticeably lately, maybe even getting close to as hot as the summers of the Dust Bowl era. Could I fix that?

Of course I can! Instead of using the USA temperature data from the “experts,” those people at NOAA (the National Oceanic and Atmospheric Administration) who think they’re so good at it just because they’ve spent decades studying all that “math” and learning how to do it “right,” I’ll just take the raw data and form a simple average. Those NOAA people will tell you that isn’t right, that over the years new stations have come online and old ones have retired, so you have to take that into account. They’ll talk about fancy-schmancy math stuff like “area weighting.” That’s all just NOAA tricks; aren’t they just a bunch of frauds? We can completely ignore the fact that over the years the average location of the contributing stations has moved slightly northward, into colder territory:

Heck, we can completely ignore everything that they’ve learned about how to do it right … mainly because if we just take a simple, naive average, we’ll get what we want.
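Here’s a toy illustration of why the naive average misleads (the numbers are invented). If the mix of reporting stations drifts northward into colder territory, a simple average of whatever stations happen to report shows spurious cooling even though no station’s climate changes at all. Holding the regional weights fixed makes the fake trend disappear:

```python
import numpy as np

years = np.arange(1918, 2021)
south_clim, north_clim = 30.0, 20.0  # hypothetical regional mean highs (deg C)

# Suppose the climate itself never changes, but the share of reporting
# stations sitting in the colder north grows from 30% to 60%
north_frac = np.linspace(0.3, 0.6, years.size)

# Naive average over whatever stations report: drifts downward (fake cooling)
naive = north_frac * north_clim + (1.0 - north_frac) * south_clim

# Crude fixed-weight average, each region always counting equally:
# constant, as it should be, since nothing actually changed
fixed = np.full(years.size, 0.5 * north_clim + 0.5 * south_clim)
```

This is a deliberately crude stand-in for real area weighting, but it captures the point: the “trend” in the naive average comes entirely from the changing station mix.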

There’s a graph going around the internet from Steve Goddard, a.k.a. Tony Heller, claiming to show that temperature in the U.S. has been declining — using only high temperatures, using only summertime temperatures, using only data since 1918, based on a simple average that takes no account of new stations coming online, old stations retiring, area weighting, or any of that “expert” stuff:

Imagine that.

This blog is made possible by readers like you; join others by donating at My Wee Dragon.