Sometimes you come across something and wonder how you ever missed it. Hindsight makes things seem so obvious. Today’s post is about one of those examples.



A couple of years ago, there was controversy over a highly publicized paper by Stephan Lewandowsky – NASA Faked the Moon Landing—Therefore, (Climate) Science Is a Hoax: An Anatomy of the Motivated Rejection of Science – which claimed skeptics are conspiracy theorists. The paper said a link to its survey had been posted on eight blogs, one of which was Skeptical Science. Nobody could find any evidence Skeptical Science had linked to the survey (though its proprietor did post a link on Twitter). This was problematic because the supplementary material of Lewandowsky’s paper relied upon an (unpublished) analysis of Skeptical Science traffic to claim the survey was seen by a sizable number of skeptics, saying:

Prevalence of “skeptics” among blog visitors All of the blogs that carried the link to the survey broadly endorsed the scientific consensus on climate change (see Table S1). As evidenced by the comment streams, however, their readership was broad and encompassed a wide range of view on climate change. To illustrate, a content analysis of 1067 comments from unique visitors to http://www.skepticalscience.com, conducted by the proprietor of the blog, revealed that around 20% (N = 222) held clearly “skeptical” views, with the remainder (N = 845) endorsing the scientific consensus. At the time the research was conducted (September 2010), http://www.skepticalscience.com received 390,000 monthly visits. Extrapolating from the content analysis of the comments, this translates into up to 78,000 visits from “skeptics” at the time when the survey was open (although it cannot be ascertained how many of the visitors actually saw the link.) For comparison, a survey of the U. S. public in June 2010 pegged the proportion of “skeptics” in the population at 18% (Leiserowitz, Maibach, Roser-Renouf, & Smith, 2011). Comparable surveys in other countries (e. g., Australia; Leviston & Walker, 2010) yielded similar estimates for the same time period. The proportion of “skeptics” who comment at http://www.skepticalscience.com is thus roughly commensurate with their proportion in the population at large.

If a link to the survey was never posted at Skeptical Science, this unpublished, unverifiable analysis of Skeptical Science visitors would be completely irrelevant. Without it, Lewandowsky and his co-authors would have no basis for arguing any meaningful number of skeptics had been exposed to their survey.

If skeptics hadn’t taken the survey, the survey obviously couldn’t show skeptics believe in conspiracy theories. To say skeptics believe in conspiracy theories, a survey needs skeptics who claim to believe in conspiracy theories. That wasn’t the case here. Practically nobody who took the survey claimed both to be skeptical of global warming and to believe in conspiracy theories.

Instead, what happened is a lot of people who accept global warming said they don’t believe in conspiracy theories. Lewandowsky took this correlation between accepting global warming and not believing in conspiracy theories as proof people who don’t accept global warming do believe in conspiracy theories. That sort of “correlation” is meaningless, as I’ve shown before by coming up with graphs like this one:

Which finds the exact same kind of fake “correlation” between being skeptical of global warming and believing in conspiracy theories as found in this graph:

Which finds a “correlation” between believing global warming is a serious threat and thinking pedophilia is okay. Both of these “correlations” are fake, arising entirely from not asking many people in the group you’re interested in what they believe. Instead, you ask many people in some other group what they believe and simply assume the group you’re interested in believes the opposite.

This actually violates the mathematical assumptions of the correlation tests the authors used. Those tests require a “normal” distribution to be valid; in other words, your data has to be a representative sample, or you get spurious results. If the authors didn’t get many skeptical responses (and a sizable number of people who believe in the various conspiracy theories), the correlation tests they used would be completely invalid.

The supplementary material for the paper would have us believe there were no data problems. We can check the data to see that’s false, but we can also examine the claim that the survey was ever linked to at Skeptical Science. Others have shown that claim is almost unquestionably false by looking at archived versions of the Skeptical Science website (via the Wayback Machine). Those archived versions make it nearly impossible to believe a post was published on the site and then deleted, as Stephan Lewandowsky has argued and John Cook (proprietor of Skeptical Science) has specifically claimed:

Skeptical Science did link to the Lewandowsky survey back in 2011 but now when I search the archives for the link, it’s no longer there so the link must’ve been taken down once the survey was over.

This overwhelming evidence, combined with ongoing correspondence with Cook and Lewandowsky, led people to accuse them of intentionally lying about the issue. It’s a pretty compelling case.

It can be made more compelling though. Skeptical Science assigns every post on its website a number. That number goes up by one each time a new post is made. To find old posts, you can use a URL like:

http://skepticalscience.com/news.php?n=330

Which will take you to a post titled, “Arctic Sea Ice: Why Do Skeptics Think in Only Two Dimensions?” That post was written on August 23, 2010. Change the number 330 to 331, and you get taken to a post titled, “Station drop-off: How many thermometers do you need to take a temperature?” That one was written on August 24, 2010.
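That numbering scheme can be walked mechanically. A small sketch (the URL pattern comes from the examples above; actually fetching the pages is left out):

```python
# Skeptical Science post URLs follow a sequential-ID pattern, so a
# range of candidate URLs can be generated directly and checked
# one by one in a browser or with an HTTP client.
BASE = "http://skepticalscience.com/news.php?n={}"

urls = [BASE.format(n) for n in range(330, 341)]  # posts 330..340
print(urls[0])   # http://skepticalscience.com/news.php?n=330
print(urls[-1])  # http://skepticalscience.com/news.php?n=340
```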

According to Stephan Lewandowsky and John Cook, Skeptical Science posted a link to the survey on August 28, 2010. The Wayback Machine provides a convenient list of posts published around that time. The earliest post on that list is “Station drop-off: How many thermometers do you need to take a temperature?” That was post 331. Here are the posts uploaded after it:

332: “Arctic sea ice… take 2” – August 25, 2010

333: “Climate Models: Learning From History Rather Than Repeating It” – August 26, 2010

334: “Can humans affect global climate?” – August 26, 2010

335: “Comparing volcanic CO2 to human CO2” – August 27, 2010

336: “Ocean acidification threatens entire marine food chains” – August 28, 2010

337: “Why we can trust the surface temperature record” – August 28, 2010

338: “Human CO2: Peddling Myths About The Carbon Cycle” – August 29, 2010

339: “Sea level rise: the broader picture” – August 30, 2010

340: “Carbon dioxide equivalents” – September 1, 2010

As this list shows, there are no deleted posts in this time period. If a post had been deleted, the URL with that post’s number would be broken. There are no broken links, meaning no post was deleted. That is conclusive proof Skeptical Science did not create a post directing people to the survey for Lewandowsky’s paper.
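The “no gaps, therefore no deleted posts” argument can be expressed as a simple check over the post numbers listed above. The data is copied from this post; the helper function is my own:

```python
# Post numbers and publication dates copied from the list above.
posts = [
    (331, "2010-08-24"),  # Station drop-off
    (332, "2010-08-25"),  # Arctic sea ice... take 2
    (333, "2010-08-26"),  # Climate Models: Learning From History...
    (334, "2010-08-26"),  # Can humans affect global climate?
    (335, "2010-08-27"),  # Comparing volcanic CO2 to human CO2
    (336, "2010-08-28"),  # Ocean acidification threatens entire marine food chains
    (337, "2010-08-28"),  # Why we can trust the surface temperature record
    (338, "2010-08-29"),  # Human CO2: Peddling Myths About The Carbon Cycle
    (339, "2010-08-30"),  # Sea level rise: the broader picture
    (340, "2010-09-01"),  # Carbon dioxide equivalents
]

def missing_ids(posts):
    """Return any numbers absent from an otherwise consecutive run of IDs."""
    ids = [n for n, _ in posts]
    return sorted(set(range(min(ids), max(ids) + 1)) - set(ids))

print(missing_ids(posts))  # [] -- no gap in the numbering, no deleted post

# The dates are also non-decreasing, consistent with strictly
# sequential numbering (ISO dates sort chronologically as strings).
dates = [d for _, d in posts]
print(dates == sorted(dates))  # True
```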

It’s worth noting there is a post in the Wayback Machine’s snapshot that is not present in that list. It’s titled “Hansen etal hit a Climate Home Run — in 1981.” That post doesn’t show up in our list because it is post number 328. It likely appears out of order because Skeptical Science sometimes uploads posts without formally publishing them, leaving time to review what they say. The number is assigned when a post is uploaded, not when it formally goes live.