

Richard Muller, a cantankerous but creative physicist at the University of California, Berkeley, once derided climate change research, then dove in with his own reconstruction of land temperature changes and confirmed substantial warming. He has now concluded that recent warming is “almost entirely” human caused.

He claims his new analysis, which has been posted* for public review but has not yet been peer reviewed (more on that below), provides an even firmer view of human-driven warming than the 2007 report by the Intergovernmental Panel on Climate Change. Here’s the general flow of events, which is — as Keith Kloor noted overnight — “great fodder for the long-running soap opera, ‘As the Climate World Turns.’”

Muller’s team last fall submitted four papers, summarizing its review of a vast array of temperature records spanning two centuries, to the journal JGR Atmospheres, and posted them, along with supporting data and other material, at the Berkeley Earth Surface Temperature, or BEST, Web site. (The papers have not been published yet, and one of my first questions for Muller and his team is whether they have been accepted.)

The team’s strong new conclusion that human-generated greenhouse gases are driving recent warming is one of several findings in a fifth paper that Muller says is being submitted to the journal and posted on his Web site as well [this afternoon].

Muller, who has combined P.T. Barnum showmanship and science throughout his three-year project, chose to break the news in an Op-Ed article in The Times (with various leaks and rumors percolating on the Web). There are perils in having publicity precede peer review. For hints of how this could backfire, read on.

[Jan. 19, 9:31 p.m. | Update |: One of the Muller papers, “A New Estimate of the Average Earth Surface Land Temperature Spanning 1753 to 2011,” has passed peer review and has been published in a new journal, Geoinformatics and Geostatistics.]

In the Times article, he’s thrown down the gauntlet to remaining skeptics, challenging them to find an alternative explanation for the measured warming:

How definite is the attribution to humans? The carbon dioxide curve gives a better match than anything else we’ve tried. Its magnitude is consistent with the calculated greenhouse effect — extra warming from trapped heat radiation. These facts don’t prove causality and they shouldn’t end skepticism, but they raise the bar: to be considered seriously, an alternative explanation must match the data at least as well as carbon dioxide does.

After the first round of papers went online last fall, some climate scientists, while put off by Muller’s past diatribes and self-promotional zeal, were mildly enthusiastic (see Gavin Schmidt here).

But others, notably the climate modeler William Connolley through his Stoat blog, have dismissed Muller’s work — old and new — as “rubbish.”

It’s particularly notable that one collaborator on the first batch of papers, Judith Curry of the Georgia Institute of Technology, declined to be included as an author on the new one. I learned this when I sent her this question by e-mail:

Do you share Rich’s extremely high confidence on attribution of recent warming to humans…?

Here’s Curry’s reply:

I was invited to be a coauthor on the new paper. I declined. I gave them my review of the paper, which was highly critical. I don’t think this new paper adds anything to our understanding of attribution of the warming…. I really like the data set itself. It is when they do science with it that they get into trouble.

Curry also sent this note, which she is distributing to other journalists:

The BEST team has produced the best land surface temperature data set that we currently have. It is best in the sense of including the most data and extending further back in time. The data quality control and processing use objective, statistically robust techniques. That said, the scientific analyses that the BEST team has done with the new data set are controversial, including the impact of station quality on interpreting temperature trends and the urban heat island effect. Their latest paper on the 250-year record concludes that the best explanation for the observed warming is greenhouse gas emissions. Their analysis is way oversimplistic and not at all convincing in my opinion. There is broad agreement that greenhouse gas emissions have contributed to the warming in the latter half of the 20th century; the big question is how much of this warming can we attribute to greenhouse gas emissions. I don’t think this question can be answered by the simple curve fitting used in this paper, and I don’t see that their paper adds anything to our understanding of the causes of the recent warming. That said, I think there are two interesting results in this paper, regarding their analysis of 19th century volcanoes and the impact on climate, and also the changes to the diurnal temperature range.

From my perspective as a longtime, but lay, analyst of climate science, my sense is she has it right. The data-sifting methods of the Berkeley project, largely developed by a brilliant data analyst, Robert Rohde (there’s more on him here), have clearly added value to longstanding efforts to clarify temperature trends across a variegated planet. But the conclusions Muller describes now do seem overly simplistic (as Curry, Connolley and others say).

In an e-mail Sunday morning, Curry said she’ll be posting on her blog later today a long analysis of the new paper, written by Steven Mosher and Zeke Hausfather, who have examined the Berkeley work there before. [Here’s the post.]

It appears that Muller pushed to get the new findings submitted now because Tuesday is the deadline for submitting research to journals if it is to be considered in the next climate science report from the Intergovernmental Panel on Climate Change.

If the Berkeley analysis turns out to have been rushed or its conclusions poorly supported, you’ll quickly see opponents of limits on greenhouse gases join Connolley’s “rubbish” chorus — and, once again, it’ll be clear that science alone is unlikely to break the political blockades over this issue. WattsUpWithThat blogger Anthony Watts, who criticized Muller last year as rushing flawed work, also appears poised to weigh in this afternoon.

Setting aside questions about the robustness of Muller’s scientific conclusions, there is certainly plenty of evidence to support his article’s capping point, which is that clarifying the scientific picture of a human-heated planet is relatively easy compared to weighing responses to this long-established reality:

I hope that the Berkeley Earth analysis will help settle the scientific debate regarding global warming and its human causes. Then comes the difficult part: agreeing across the political and diplomatic spectrum about what can and should be done.

The aspects of the climate problem that make it “beyond super wicked” mostly lie outside the realm of science.

These are updates in reverse chronological order (newest at the top):



The climate blogger known as Eli Rabett is not kind to the new study’s method for establishing a human driver for recent warming, describing the “overwhelming naivety of their attribution studies.” Here’s an excerpt:

While it is true that public discussion focuses on the global temperature anomaly, this is, at best, a crude and not very informative axe that BEST [the Berkeley team] is wielding, a tool that you can pick up in any blog (see Wood for Trees for the franchised version). It ignores the spatial structure of the forcings and the observations which are necessary for real attribution. Although it notes some, thus missing the trees (what kind are they, how old, condition), for the forest (look, great Birnam wood to high Dunsinane hill shall come). OK, it did not make much difference to MacBeth in the end that the trees were fir, dead is dead, but in trying to understand the past, present and future of the earth linking cause and effect requires more than a single global parameter. In particular the proud abhorrence of using models, to validate observations and explain correlations, the forest level comparisons, ignoring anything at a finer level than global land surface temperature anomalies, drives a huge stake through the paper….

Here’s the rest. As I noted Sunday, there’s every chance in this arena of being right for the wrong reasons.

Judith Curry has posted a piece explaining why she did not add her name to the new Berkeley temperature project paper. Here’s one overarching point:

No one that I listen to questions that adding CO2 will warm the earth, all other things being equal. The issue is whether anthropogenic activities or natural variability is dominating the climate variability. If the climate shifts hypothesis is correct (this is where I am placing my money), then this is a very difficult thing to untangle, and we will go through periods of rapid warming that are followed by a stagnant or even cooling period, and there are multiple time scales involved for both the external forcing and natural internal variability that conspire to produce unpredictable shifts. Maybe the climate system is simpler than I think it is, but I suspect not. I do know that it is not as simple as portrayed by the Rhode, Muller et al. analysis.

Make sure to read the rest.

At Judith Curry’s blog, you can now read a semi-independent detailed analysis of the new Berkeley paper by Steven Mosher and Zeke Hausfather. Both are participating in the Berkeley Earth Surface Temperature Project but say this analysis reflects their personal views, not those of the project team.

Today the Berkeley Earth Surface Temperature Project released a major update to its temperature data.

The new paper, “A New Estimate of the Average Earth Surface Land Temperature Spanning 1753 to 2011,” has been posted, along with extensive related material, including responses to frequently asked questions such as these:

Do Judith Curry and Richard Muller disagree?

What is new about the statistical approach used?

How does Berkeley Earth differ from other global temperature estimates?

Why didn’t Berkeley Earth wait for peer review?

Elizabeth Muller of the Berkeley temperature project has responded to several questions I asked about their work. Here are the questions:

1) What’s the status of the four papers that were submitted last fall (accepted, in review, etc.)?

2) There can be perils when publicity precedes peer review. Are you all confident that the time was right to post the papers, including the new one, ahead of review? Presumably this has to do with the Tuesday deadline for IPCC eligibility?

3) Judith Curry says she provided very critical comments on the new paper and declined to be an author (the prime concern, and that of William Connolley, being that curve fitting alone is nowhere near sufficient for attribution). Is the team confident her worries are unfounded?

Here’s her reply:

All of the articles have been submitted to journals, and we have received substantial journal peer reviews. None of the reviews have indicated any mistakes in the papers; they have instead been primarily suggestions for additions and further citations of the literature. One review had no complaints about the content of the paper but suggested delaying publication until the long background paper, describing our methods in detail, was actually published.

In addition to this journal peer review, we have had extensive comments from other scientists based on the more traditional method of peer review: circulation of preprints to other scientists. It is worth remembering that the tradition in science, going back to before World War II, has been to circulate “preprints” of articles that had not yet been accepted by a journal for publication. This was truly “peer” review, and it was very helpful in uncovering errors and assumptions. We have engaged extensively in such peer review. Of course, rather than sending the preprints to all the major science libraries (as was done in the past), we now post them online. Others make use of arXiv. This has proven so effective that in some fields (e.g. string theory) the journal review process is avoided altogether, and papers are not submitted to journals. We are not going to that extreme, but rather are taking advantage of the traditional method.

We note that others in the climate community have used this traditional approach with great effectiveness. Jim Hansen, for example, frequently puts his papers online even before they are submitted to journals. Jim has found this method to be very useful and effective, as have we. As Jim is one of the most prominent members of the climate community, and has been doing this for so long, we are surprised that some journalists and scientists think we are departing from the current tradition.

The journal publication process takes time.
This is especially true when new methods of analysis are introduced. Later today we will be posting revised versions of 3 of the 4 papers previously posted (the 4th paper has not changed significantly). The core content of the papers is still the same, though the organization and detail have changed a fair amount.

The new paper, which we informally call the “Results” paper, has also undergone journal peer review (and none of the reviews required changing our results). We are posting it online today as a preprint because we also want to invite comments and suggestions from the larger scientific community. I believe the findings in our papers are too important to wait the year or longer that it could take to complete the journal review process. We believe in traditional peer review; we welcome feedback from the public and any scientists who are interested in taking the time to make thoughtful comments. Indeed, with the first 4 papers submitted, many of the best comments came from the broader scientific community. Our papers have received scrutiny by dozens of top scientists, not just the two or three that typically are called upon by journals.

We also believe in full transparency, which is why we are posting our data and programs – even before our results have been formally published in a journal. We would love for other people to get into the data and analysis – the sooner the better. Again, we think the results are important, and they need to be looked at sooner rather than later.

Regarding Judith Curry, there is broad general agreement that the results released today give a new and improved estimate of the global land temperature going back 250 years. Judith also agrees that the findings on volcanoes and changes to the diurnal temperature range (both discussed in the results paper) make useful contributions to the field.
The disagreement comes only over Berkeley Earth’s use of a simple model that fits the temperature record for the past 250 years to human CO2 emissions and volcanoes and concludes that the best explanation for the observed warming is greenhouse gas emissions. The match between the data and the theory doesn’t prove that carbon dioxide is responsible for the warming; it does mean, however, that any alternative explanation should do as well or better. While the Berkeley Earth team values the simplicity of the model (indeed, in physics the simplest model is generally considered the best), Curry is not convinced and thinks it is overly simplistic. These sorts of disagreements are common among scientists and contribute usefully to advancing science.
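For readers wondering what “simple curve fitting” means in practice, here is a toy illustration. Everything below is a hypothetical sketch (synthetic temperature data and made-up forcing series, not the Berkeley Earth data or code): it merely shows the general shape of such a fit, regressing a temperature anomaly series on the logarithm of CO2 concentration plus a volcanic-eruption term by ordinary least squares.

```python
# Illustrative sketch only -- NOT the Berkeley Earth analysis.
# Shows the general shape of a "simple curve fit": regress a synthetic
# temperature anomaly on log(CO2) plus a volcanic term via least squares.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1850, 2011)          # 161 years

# Hypothetical forcing series, invented for illustration.
co2 = 285.0 + 0.012 * (years - 1850) ** 1.7     # ppm, smooth accelerating rise
volcanic = np.zeros(years.size)
volcanic[[33, 52, 113, 141]] = 1.0               # a few eruption-year spikes

# Synthetic "observed" anomaly: warms with log(CO2), cools after eruptions.
temp = (2.5 * np.log(co2 / 285.0)
        - 0.3 * volcanic
        + rng.normal(0.0, 0.1, years.size))      # weather-like noise

# Least-squares fit: temp ~ a*log(CO2/285) + b*volcanic + c
X = np.column_stack([np.log(co2 / 285.0), volcanic, np.ones(years.size)])
(a, b, c), *_ = np.linalg.lstsq(X, temp, rcond=None)
print(f"fitted CO2 coefficient {a:.2f}, volcanic coefficient {b:.2f}")
```

A fit like this recovers the coefficients it was built from, which is exactly the critics’ point: matching a single global curve says nothing about spatial structure or physical mechanism, which is why Curry and Connolley argue it cannot settle attribution on its own.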

Anthony Watts has posted a package at his blog on a new paper (which, like Muller’s, precedes peer-reviewed publication) concluding that United States temperature trends in recent decades “are spuriously doubled, with 92% of that over-estimation resulting from erroneous NOAA adjustments of well-sited stations upward. The paper is the first to use the updated siting system which addresses USHCN siting issues and data adjustments.” This is meant as a challenge to Muller’s conclusions, given that much rests on the substantial warming of recent decades.

[* Updated at asterisk.]

In case you missed it last year, here’s an animated timeline of the study’s temperature data from 1800 on (the record now goes back to 1753):