Willie Soon is a name that pops up every so often in the climate ‘debate’. He was the lead author on the Soon and Baliunas (2003) paper (the only paper that has ever led to the resignation of six editors in protest at the failure of peer review that led to its publication). He was a recent speaker (from 37.20) at the 2011 Heartland Institute conference, and can be counted on to produce a contrarian take on almost any issue that anyone might care about – from climate to mercury in fish to polar bear population dynamics.



Recently, there has been a renewed focus on how much money Willie Soon has taken from fossil fuel companies for his ‘research’ (over $1 million). While that is impressive, the real issue is not how he gets paid, but the quality of his science. The discussion last week about his finances did lead people to notice his publicly accessible website, where he has posted papers, emails, calculations and reviews going back to 2003 (Update: since this piece was posted, many items have been blocked. We have switched the links to saved versions). There is quite a lot of interesting stuff there, including a few curious tidbits.



Figure: Distinct populations of polar bears across the Arctic. WH = Western Hudson Bay, nr. Churchill, Manitoba.

One particularly amusing find is evidence of some outrageous cherry-picking in what ended up as the Dyck, Soon, Baydack, Legates, Baliunas, Ball and Hancock (2007) paper. This paper attempted to cast doubt on the sensitivity of polar bears in Western Hudson Bay to climate change, a basis of the eventual US Fish and Wildlife listing of the polar bear as ‘threatened’. Earlier work by Andy Derocher, Ian Stirling and others had documented the clear reduction in sea ice in Hudson Bay and the subsequent reduction in the period available for hunting seals, and the impacts on the population.

Note that as a “viewpoint” paper, the Dyck et al submission was not peer-reviewed. Instead, it was accepted (March 2 2007) only one day after it was received (March 1 2007). Soon’s website indicates that a very similar paper had been submitted as a normal paper at least once before (in 2003, with only Dyck and Soon as authors). The reviews for that paper (or one very similar) are also available (dated June 2003).

The paper itself (both versions) is a collection of standard arguments for why everything is uncertain and nothing can be concluded, but it did actually include a little analysis. Specifically, the claim was made that temperatures in Churchill, Manitoba (close to the center of the Western Hudson Bay population of bears) had not risen, and that instead, any multidecadal variations in temperatures affecting the bears were related to the Arctic Oscillation (AO), a mode of natural variability. Of course, temperatures in the Churchill region have risen, and the ice in Hudson Bay is melting earlier and forming later (by about a month in each case) than 30 years ago. But the interesting aspect is the impact of the AO, which certainly affects short-term temperatures in the Arctic.



Figure: Regressions of winter temperature anomalies with the strength of the AO (JISAO). (ºC change in temperature per unit increase in the AO – for reference, the AO varies from roughly -3 to 3 on a monthly timescale).

Andrew Derocher, who signed his review, queried why the figure showing an impact of the AO on temperature (r2=0.52) used the data from Frobisher Bay (Iqaluit) in the Labrador Sea instead of the data from Churchill. Frobisher Bay is just under 1000 miles away from Churchill and doesn’t border Hudson Bay at all, so its relevance to bears in Western Hudson Bay is somewhat mysterious. It is, however, very close to the center of the AO influence (as seen in the above figure). Derocher suggested that Dyck et al use the correlations at Churchill instead (makes sense, no?). In the final published paper (2007), however, the correlation with the AO still used the Frobisher Bay data – exactly as it appeared in the first draft in 2003.

What the files reveal, however, is that Soon had already calculated the correlations of Churchill temperatures to the AO, and found that the correlation was very low – regardless of what month or season he used (the files are dated to January 2003 – prior to Derocher’s review). None of the correlations showed an r2 > 0.24 (highest in August), and most were much smaller (especially during the key spring period, where the variance explained was less than 5%). Note that a value like r2=0.24 is not necessarily meaningless – indeed, for the number of data points involved here (between 50 and 60), it is probably statistically significant relative to a standard ‘red noise’ null hypothesis. However, the variance explained is small.
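For readers unfamiliar with this kind of calculation, here is a minimal sketch of what such an analysis involves: regressing a station temperature series on the AO index to get a slope and r², and then checking that r² against an AR(1) ‘red noise’ null via Monte Carlo surrogates. The data below are synthetic stand-ins (this is not Soon’s actual calculation or the real Churchill/AO series), chosen only to illustrate why a modest r² can be significant at n ≈ 55 while still explaining little variance.

```python
# Sketch only: regress a temperature anomaly series on the AO index,
# then test the observed r^2 against AR(1) red-noise surrogates.
# All data here are synthetic; numbers are illustrative, not Soon's.
import numpy as np

rng = np.random.default_rng(0)

def r_squared(x, y):
    """Fraction of variance in y explained by a linear fit on x."""
    r = np.corrcoef(x, y)[0, 1]
    return r * r

def red_noise_pvalue(x, y, n_surrogates=2000):
    """Monte Carlo p-value: how often does AR(1) noise with the same
    lag-1 autocorrelation as y match or beat the observed r^2?"""
    obs = r_squared(x, y)
    phi = np.corrcoef(y[:-1], y[1:])[0, 1]  # lag-1 autocorrelation of y
    n = len(y)
    hits = 0
    for _ in range(n_surrogates):
        e = rng.standard_normal(n)
        surr = np.empty(n)
        surr[0] = e[0]
        for t in range(1, n):
            surr[t] = phi * surr[t - 1] + e[t]  # AR(1) process
        if r_squared(x, surr) >= obs:
            hits += 1
    return hits / n_surrogates

# ~55 winters of synthetic data: an AO index, and a temperature anomaly
# only weakly coupled to it (loosely analogous to Churchill).
n = 55
ao = rng.standard_normal(n)
temp = 0.5 * ao + rng.standard_normal(n)  # weak coupling plus noise

slope = np.polyfit(ao, temp, 1)[0]  # degC per unit AO, as in the figure
print(f"slope = {slope:.2f} degC per unit AO, r^2 = {r_squared(ao, temp):.2f}")
print(f"red-noise p-value ~ {red_noise_pvalue(ao, temp):.3f}")
```

The point of the surrogate test is that monthly temperature series are autocorrelated, so comparing against white noise would overstate significance; even so, a ‘significant’ r² of 0.2 still leaves 80% of the variance unexplained.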

Soon had also calculated the impact of the AO on the Frobisher Bay data and, unsurprisingly, used the season with the highest correlation. The fact that Frobisher Bay temperatures and Churchill temperatures are only loosely correlated (also calculated by Soon; highest monthly r2 was 0.22) was not mentioned in either version of the paper.

The link is made (in the 2003 version) using:

… the temperature and climatic conditions around the Hudson Strait and Hudson Bay areas have close association with the AO circulation index

which is an attempt to imply that the Hudson Strait connection also applies to Hudson Bay (which Soon already knew was untrue). The version in the 2007 paper was only slightly different:

… the air temperature and climatic conditions around the Hudson Strait and Hudson Bay areas have a close association with the AO circulation index.

but is equally misleading.

So, the picture here is quite clear. Soon knew that the relevant data series for discussing the AO influence on Western Hudson Bay temperature (and by proxy, sea ice) was from Churchill and, despite being reminded of this by the first set of reviewers, nonetheless continued to show only the AO connection to a site 1000 miles away (which had a much higher correlation), without any discussion of whether that data was at all relevant to Churchill or the bears nearby.

There was much else in the Dyck et al paper that was worth criticising (see Stirling et al (2008) for details), but the evidence from the files of cherry-picking of data for the sake of an (irrelevant) higher correlation is a very clear red flag.

In my opinion, this kind of ‘scientific’ sleight-of-hand is far more egregious than Soon’s ability to get funding from coal, oil, and fossil-fueled foundations.