McIntyre has sent the following comment to Revkin via email. (AR has posted it in the comments).

By framing a Climate Audit discussion as extended peer review, you can perhaps see why demands that I produce my own reconstruction are off point to the criticisms. Climate Audit is a form of extended peer review and criticism.
The criticisms that I’ve made have nearly all been the sort of questions that, in my opinion, a properly informed peer reviewer should have asked. For example, the Yamal dispute that you covered a couple of years ago was an extended peer review issue: I asked why the authors had not used the site selection method in the Yamal area that they had used in the Avam-Taimyr area. Had a peer reviewer understood the methods well enough, he ought to have asked that question. It was a good question at the time and remains a good question.
If the authors had responded – as CRU and Real Climate did – by saying, “Nyah, nyah, we can get Hockey Sticks some other way,” the reviewer and editor would have concluded that the authors were deranged. This sort of question should be answered on its own terms – as Karoly appears intent on doing.
The same thing applies to the ongoing Mann debate. The technical “peer review” points were never answered. Instead, the topic gets changed to “we can get Sticks some other way” or “Sticks don’t matter.”
Although “peer review” is apparently an integral part of academic publishing, there is astonishingly little empirical study of peer review itself, because the documents are not available. Most academic discussion tends to be programmatic and platitudinous: peer review is flawed, but it’s the best available alternative.
I try to avoid generalities. In practical terms, if reliance is to be placed on academic articles for policy purposes, people need to recognize what isn’t done in journal peer review: no review of data and no detailed review of methodology. It is evident to me that academics need to accommodate “extended peer review” by archiving data and meticulously documenting procedures (source code is an aid to this, though it is too often sneered at). The hypocrisy of academics expecting large-scale policy changes while refusing to provide data on the grounds of “intellectual property rights” is risible and deserves the contempt of the public.
BTW, the problems in Gergis et al are no worse than those in Mann et al 2008, which should have been retracted for its use of contaminated, upside-down data in its centerpiece no-dendro reconstruction. By conceding the problem, Karoly and Gergis have suffered some loss of reputation, while Mann’s obfuscation has been extremely successful within the field. It seems unfortunate that people suffer for doing the right thing and are rewarded for doing the wrong thing. (But this is not the only place in the world where that happens.)
Cheers, Steve