#DataGate: Hadley reply to first audit with foggy excuses about problems 2,000 staff didn’t find

Last week we exposed absurd errors, brutal adjustments and an almost complete lack of quality control (was there any at all?) in the key HadCRUT4 data. The IPCC’s favorite set is maintained (I’m feeling generous) by the Met Office Hadley Centre and the Uni of East Anglia’s CRU in the UK.

Finally the Met Office Hadley Centre team have replied to Graham Lloyd regarding John McLean’s audit. They don’t confirm or discount any of his new claims specifically. But they acknowledge his previous notifications were useful in 2016, and promise “any errors will be fixed in the next update.” That’s nice to know, but it raises the question of why a PhD student working from home can find mistakes that a £226 million institute with 2,100 employees could not.

They don’t mention the killer issue of the adjustments for site-moves at all — that’s the cumulative cooling of the oldest records to compensate for buildings that probably weren’t built there ’til decades later.

Otherwise this is the usual PR fog — a few outliers don’t change the trend, the world is warming, and other datasets show “similar trends”. The elephant in the kitchen is the site-move adjustments, which do change the trend, and which they didn’t mention.

And while the absurd outliers may not change the trend (we don’t know yet), the message from frozen tropical islands is terrible. These bizarre mistakes are like glowing hazard signs that the dataset is neglected, decaying, essentially junk. What else might be wrong? How do we reconcile the experts’ urgent insistence that climate change is the greatest threat to life on Earth with data that apparently aren’t important enough to bother checking? We must pay trillions, turn vegetarian, and live in cold rooms, but the actual historic measurements are irrelevant. Were some numbers left in Fahrenheit for 40 years? Never mind.

They claim that automated quality control checks are done, as are manual checks, but we are still wondering what that means when they haven’t even done a spelling check, nor bothered to filter out the freak outliers which are hotter than the hottest day on Earth. These kinds of checks are something a 12-year-old geek could write the code for.
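To illustrate how trivial that kind of range check is, here is a minimal sketch in Python. The station names, record structure, and margin are illustrative assumptions, not the Met Office’s actual QC pipeline or the audit’s real data — the point is only that flagging readings outside anything ever observed on Earth takes a few lines.

```python
# Illustrative range check for temperature records.
# Thresholds are the highest and lowest surface air temperatures ever
# reliably recorded, in degrees Celsius, plus a safety margin.
WORLD_RECORD_HIGH_C = 56.7   # Death Valley, 1913
WORLD_RECORD_LOW_C = -89.2   # Vostok, 1983

def flag_impossible_readings(records, margin=5.0):
    """Return the (station, year, month, temp_c) tuples that fall
    outside anything ever observed on Earth."""
    lo = WORLD_RECORD_LOW_C - margin
    hi = WORLD_RECORD_HIGH_C + margin
    return [r for r in records if not (lo <= r[3] <= hi)]

# Hypothetical sample data: a tropical station reporting a reading
# hotter than any day ever recorded would be caught immediately.
sample = [
    ("Tropical station A", 1978, 4, 81.5),   # impossible: flagged
    ("Island station B",   1981, 12, 0.0),   # suspicious, but possible
    ("Inland station C",   1990, 7, 24.3),   # normal
]
flagged = flag_impossible_readings(sample)
```

Running this over 7 million records is a single linear pass — exactly the kind of sanity check that would have caught the frozen-tropical-island entries decades ago.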

The Met Office protests that the database includes “7 million points”, but then, they do have a supercomputer that can do 16,000 trillion calculations every second. The ten-nanosecond test against the world record temperature would have fished out the silliest mistakes, some of which have been there for decades.

They claim they are backed up by other datasets, but all the world’s temperature sets are juggling the same pool of measurements. If the shonky site-move adjustments start with national met bureaus, then get sent out around the world, all the global datasets combine the same mistakes and make similar overestimations.

Graham Lloyd, The Australian

Britain’s Met Office has welcomed an audit from Australian researcher John McLean that claims to have identified serious errors in its HadCRUT global temperature record.

“Any actual errors identified will be dealt with in the next major update.”

The Met Office said automated quality checks were performed on the ocean data and monthly updates to the land data were subjected to a computer assisted manual quality control process.

“The HadCRUT dataset includes comprehensive uncertainty estimates in its estimates of global temperature,” the Met Office spokesman said.

“We previously acknowledged receipt of Dr John McLean’s 2016 report to us which dealt with the format of some ocean data files.

“We corrected the errors he then identified to us,” the Met Office spokesman said.
