Now that 2011 is complete, most of the major global temperature estimates have updated their data to include the full calendar year. The only one that hasn’t yet is HadCRU, for which data are available through November 2011; December’s estimate is not yet online.



It turns out that 2011 wasn’t as hot as 2010, as can be seen from a graph of the raw data:

For GISS data, 2011 was the 9th-hottest year on record; for NCDC it was 11th-hottest; for HadCRU (using only data through November) it was 12th-hottest; for RSS it was 12th-hottest; and for UAH it was 9th-hottest.

Some interpret last year’s data as “Global temps in a Crash.” Of course they do so without any analysis, based only on the fact that recent numbers are lower than previous numbers. And they don’t properly account for all those other factors that influence temperature, especially the fact that 2011 was a strong La Niña year, so we expect it to be on the cool side of the prevailing trend.

But we can estimate, and remove, the influence of exogenous factors, including El Niño, aerosols, and solar variations. I’ve done this before, and it demonstrates that the man-made influence on temperature is creating a sizeable and inexorable trend. For the updated results, rather than use TSI (total solar irradiance) to represent solar variations I used sunspot counts, because they’re easy to get, kept up-to-date, and already available as monthly averages. Here is the adjusted temperature data, with the exogenous influence removed:
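The idea can be sketched as a multiple regression of temperature against time plus the exogenous factors, then subtracting the fitted exogenous contributions. This is a minimal illustration in Python (the packaged programs are in R); the data here are synthetic stand-ins, not the real MEI, aerosol, or sunspot series:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 384  # 32 years of monthly data

# Synthetic stand-ins for the real predictors (ENSO index,
# volcanic aerosol optical depth, sunspot counts)
t = np.arange(n) / 12.0           # time in years
enso = rng.normal(size=n)         # ENSO proxy
aerosol = rng.normal(size=n)      # aerosol proxy
sunspots = rng.normal(size=n)     # solar activity proxy

# Synthetic temperature: trend + exogenous influences + noise
temp = (0.017 * t + 0.08 * enso - 0.05 * aerosol
        + 0.02 * sunspots + rng.normal(scale=0.1, size=n))

# Multiple regression: intercept and linear trend absorb the
# warming signal; the other columns absorb exogenous factors
X = np.column_stack([np.ones(n), t, enso, aerosol, sunspots])
coeffs, *_ = np.linalg.lstsq(X, temp, rcond=None)

# "Adjusted" data: subtract the fitted exogenous contributions,
# leaving the trend plus residual variation
exogenous = X[:, 2:] @ coeffs[2:]
adjusted = temp - exogenous
```

In the real analysis the exogenous predictors are lagged and the trend may be modeled more carefully, but the subtraction step is the same: what remains after removing El Niño, aerosols, and solar variation is the part of the record the trend lives in.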

When the other factors are accounted for, the continuing trend is obvious — and any talk of “Global temps in a Crash” is shown to be the idiocy that it is.

In the adjusted data, GISS ranks 2011 as the 2nd-hottest year on record (just behind 2010), NCDC ranks it 5th, HadCRU (using only data through November 2011) ranks it 5th, RSS ranks it 2nd, and UAH ranks it 2nd. No, global temps are not in a crash — they still fluctuate (for a lot of reasons, including exogenous factors) but the trend continues. It’s called “global warming.” It’s caused by human activity. It’s dangerous.

We can even average adjusted data for the 5 major global temperature records to give us a composite estimate of the man-made component of global warming:
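Forming the composite is straightforward: put each record on a common baseline (its own anomalies have different reference periods), then average across records year by year. A minimal sketch with made-up numbers, not the actual GISS/NCDC/HadCRU/RSS/UAH values:

```python
import numpy as np

# Rows: five hypothetical adjusted records; columns: years
# (illustrative values only)
records = np.array([
    [0.40, 0.55, 0.51],
    [0.38, 0.52, 0.49],
    [0.35, 0.50, 0.46],
    [0.20, 0.38, 0.35],   # note the different baseline
    [0.22, 0.40, 0.36],
])

# Align baselines by removing each record's own mean, then
# average across records for each year
aligned = records - records.mean(axis=1, keepdims=True)
composite = aligned.mean(axis=0)
```

Without the baseline alignment, a record with a cooler reference period would drag the composite around for no physical reason; after alignment, only the shared year-to-year signal survives the average.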

2011 is second only to 2010 as hottest on record. The trend continues.

For those of you who wish to keep current with this analysis, I’ve modified the programs to use sunspot counts rather than TSI to represent solar variations, and I’ve written a program to download and format the necessary data from the internet automatically. It’s in this package:

NewSoft

Since WordPress won’t allow me to upload a “zip” file, I changed the name from “NewSoft.zip” to “NewSoft.xls” — after downloading, you should change the name back to “NewSoft.zip”, then unzip the file. It contains two R programs and some data files, but all you really need are the R programs. One of them is “datagetform.r” — when you run it, it will automatically download and format all the data, creating a master data file called “rawdata.dat”. That file is input to the other program, “allfit.r”, which will compute the influence of exogenous factors, then create two files: “Adjusted.dat” is the adjusted temperature data and “coeffs.txt” is the regression coefficients.
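If you’d rather script the rename-and-unzip step than do it by hand, something like this works (Python for illustration; to keep the demo self-contained it fabricates a tiny zip under the “.xls” name, where in practice you’d already have the real download):

```python
import os
import tempfile
import zipfile

# Work in a scratch directory; in practice, run this wherever
# the downloaded "NewSoft.xls" actually lives.
workdir = tempfile.mkdtemp()
xls_path = os.path.join(workdir, "NewSoft.xls")

# For this demo only: create a small zip saved under the .xls
# name, standing in for the real download.
with zipfile.ZipFile(xls_path, "w") as zf:
    zf.writestr("datagetform.r", "# placeholder\n")

# The actual steps: rename back to .zip, then extract
zip_path = os.path.join(workdir, "NewSoft.zip")
os.rename(xls_path, zip_path)
with zipfile.ZipFile(zip_path) as zf:
    zf.extractall(os.path.join(workdir, "NewSoft"))
```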

These programs come with no support and no guarantee, so if they transform your computer into a super-intelligent destroyer of worlds intent on removing the human infestation from the planet, I disavow any responsibility.