Guest essay by Steven Burnett

Most of my income is derived from tutoring, part of it tied into the Google Helpouts system. One of my most loyal physics and mathematics tutoring customers sent me a link to a $10,000 reward challenge for skeptics, which is now up to $30,000, seen here.

Below is what I wrote back, with minor edits. While I could have added more links or graphs, I feel this synopsis is the most compact skeptic's case without dropping too many details. Perhaps I should submit it for the $30,000.

These kinds of challenges pop up all the time; here's one for creationism:

http://www.huffingtonpost.com/2013/03/27/joseph-mastropaolo-creationist-10000-disprove-genesis_n_2964801.html

The problem with the climate change challenge is that no one is denying that there is likely an anthropogenic signal. The question is how much.

This article probably offers one of the better overviews of the issue:

http://www.washingtonpost.com/blogs/capital-weather-gang/wp/2014/06/20/global-warming-of-the-earths-surface-has-decelerated-viewpoint/

You can demonstrate in a lab that CO2 absorbs IR in the same wavelength range the earth emits. The data show that the Northern Hemisphere's mean temperature has shifted since before the industrial revolution.

Overwhelmingly, the people pushing the issue like to box skeptics in by presenting it as an all-or-nothing question, which those who don't read skeptic statements take on faith. In reality, the skeptic's side holds a much larger range of stances on the issues; I have tried to set them out in bold. There are some people who make the ridiculously stupid claim that there is no anthropogenic signal at all, but they are a very small minority. Many skeptics regard them as internet trolls, and we try not to feed them.

A better way to examine skeptics is to look at them as scientific critics, and more specifically to evaluate the criticisms as issues with each step of the scientific method in climate science fields. The standard scientific method goes:

Observations -> Hypothesis -> Experiment -> Analysis. If we were to go back to the '90s, then we could say this was doled out as follows.

Observations: a warming/CO2 concentration correlation, the fact that CO2 absorbs IR spectra, and the same trend appearing in the ice core data.

Hypothesis: Emissions cause global warming,

Experiment: Climate Model,

Analysis: a close approximation in the hindcast, statistically significant temperature increases, hockey sticks, etc. Through the late 1990s and early 2000s the conclusion that global warming is real was scientifically acceptable. The next stage is usually peer review and scrutiny, and this is where the theory ran into problems.

The Problems:

The observations aren't very good prior to the late '70s, and they get worse the further back we go in the records. We weren't looking for trends of tenths of a degree, and we weren't controlling our instruments for them. Stations have been moved, cities have grown, etc., all of which would induce a warming bias at those stations; the data are frankly of poor quality. But other problems came up as well. The ice core data, using better instrumentation, actually show CO2 lagging behind temperature changes. More importantly, temperatures stopped rising on all data sets while CO2 levels continued their upward climb. We also have not detected the "hot spot" that was supposed to be proof of an anthropogenic contribution.

Our hypothesis for how the climate system operates is essentially coded into the climate models. The IPCC uses 114 of them; all of them are going up, all of them are rising faster than observations, and most of them have been falsified. But the work falsifying them is very recent.

This paper falsifies the last 20 years of simulations at the 90% confidence interval; within that blog entry you can find another paper that falsifies them at the 98% confidence interval, and that paper cites a third that falsified them at the 95% confidence interval. What this means is that there is less than a 10%, 2%, or 5% chance, respectively, that the divergence between the models and observations is due to chance alone.
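The kind of falsification test those papers perform can be sketched as a simple consistency check between an ensemble of simulated trends and an observed trend. The numbers below are purely illustrative placeholders, not real model output or observations, and a rigorous test would also fold in observational uncertainty:

```python
import statistics

# Hypothetical, illustrative numbers only -- NOT real model output or observations.
# An ensemble of model-simulated 20-year trends (deg C / decade):
model_trends = [0.18, 0.21, 0.25, 0.19, 0.22, 0.24, 0.20, 0.23, 0.26, 0.21]
observed_trend = 0.05  # hypothetical observed trend over the same period

mean = statistics.mean(model_trends)
stdev = statistics.stdev(model_trends)

# A simple z-style check: how many ensemble standard deviations does the
# observation sit below the ensemble mean?
z = (mean - observed_trend) / stdev

# Roughly, z > 1.64 rejects consistency at the 90% level (one-sided),
# z > 1.96 at 95%, z > 2.33 at 98%.
print(f"z = {z:.2f}, inconsistent at 90% level: {z > 1.64}")
```

With these placeholder numbers the observation falls several ensemble standard deviations below the mean, which is the shape of the argument the cited papers make with real data.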

As the models are mathematical representations of how we think the climate system operates, that means climate scientists were wrong. In response there has been a flurry of activity attempting to attribute and explain the pause, but explaining a problem with a mathematical model after it occurred, while claiming your hypothesis will still be borne out, requires its own period of time to validate.

There are well over 10 different attributions for the pause at this point, most of them claiming to explain it entirely. This means the pause in its current state is over-explained, and more than one of those papers must be wrong. In science we aim for a 5% chance of being wrong; apparently climate science gets a pass.

This divergence has also impacted the metric known as climate sensitivity. Observationally derived climate sensitivities indicate a thermal increase of 1.2-1.5 C per doubling of CO2; that is, if we went from the current 400 ppm CO2 to 800 ppm, temperatures would rise approximately 1.2-1.5 degrees. Climate simulations produce much higher results, typically between 3 and 4.5 degrees per doubling.
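The "per doubling" metric comes from the conventional logarithmic approximation, dT = S * log2(C_final / C_initial). A minimal sketch (the function name is mine):

```python
import math

def warming_for_co2_change(c_final, c_initial, sensitivity):
    """Equilibrium warming (deg C) for a CO2 change, using the standard
    logarithmic approximation: dT = S * log2(C_final / C_initial),
    where S is the sensitivity per doubling."""
    return sensitivity * math.log2(c_final / c_initial)

# 400 ppm -> 800 ppm is exactly one doubling, so dT equals the sensitivity:
print(warming_for_co2_change(800, 400, 1.3))  # observationally derived range midpoint
print(warming_for_co2_change(800, 400, 3.0))  # low end of the model range cited above
```

The logarithmic form is why each additional increment of CO2 produces less warming than the last: going from 800 to 1600 ppm adds only as much as 400 to 800 did.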

Digging even further, there are major issues with the models, our experiments, themselves. Depending on the compiler, operating system, and even hardware, model output can change due to rounding errors. They can't predict clouds, nor do they have the resolution to see many of the atmospheric processes that transport heat. But that's only the climate models themselves. The impact models, or integrated assessment models, have almost no data upon which to base their claims. That's why the IPCC stated that the costs of climate change for 2.5 degrees of warming range from 0.2% to 2% of world GDP. This is, again, for predictions almost 100 years in the future, which at this time are untestable and unfalsifiable.

When evaluating the cost of a ton of CO2 emissions, the integrated assessment models depend very heavily on the discount rate. The current administration cites $37 per ton at a 3% discount rate, but the most appropriate discount rate, and the one at which long-term assessments are supposed to be performed, is 7%. At that rate the actual social cost of carbon is about $4-5 per ton. As was discussed in a Senate EPW hearing, the current administration has failed to produce the required 7% discount rate and instead produced 5%, 3%, and 2.5% rates. The reason the cost varies so much is partially due to uncertainty in the integrated assessment models, which are fed the outputs of the GCMs. A common coding colloquialism applies: GIGO, garbage in, garbage out.
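Why the discount rate dominates these estimates can be seen with a one-line present-value calculation. The damage figure below is a hypothetical placeholder; the point is only the ratio between rates:

```python
def present_value(future_damage, discount_rate, years):
    """Discount a damage occurring `years` from now back to today's dollars."""
    return future_damage / (1 + discount_rate) ** years

# A hypothetical $100 of climate damage occurring 100 years from now:
for rate in (0.025, 0.03, 0.07):
    print(f"{rate:.1%} discount rate: ${present_value(100, rate, 100):.2f} today")
```

At 3% the present value is a few dollars; at 7% it is pennies, roughly a factor of 45 smaller. That compounding gap, not any climate physics, is most of the difference between a $37/ton and a $4-5/ton social cost of carbon.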

But the worst issue is the analysis. As part of the attempts to preserve the theory, there have been some gross statistical practices and data torture employed. The first monstrosity to be slain was Michael Mann's hockey stick, which used a special algorithm to weight his tree samples. It was taken down by Steven McIntyre, a statistician who demonstrated not only the weighting issue but also found the splice point where thermometer data was grafted on when the proxy and thermometer temperatures diverged.

There have since been several other hockey sticks, all of which have gone down as giant flaming piles of poo. Trenberth's hockey stick, also from dendrochronology, died when it was pointed out that he was using a special sub-selection of only 12 trees, and that when his entire data set was used the hockey stick disappeared. In 2013, Marcott et al. published a hockey stick whose graph averaged multiple proxies, except that the blade portion, which coincided with the industrial revolution, was generated from only about 3 of the proxies, was not statistically robust, and suffered from proxy date rearrangement issues.

You have mentioned the 97% consensus papers, which do exist, but they are atrocious. Cook et al. has been rebutted several times; I also recommend reading some of the issues that Brandon Schollenberger has pointed out, though his work is not peer reviewed. The earlier 97% paper, by Doran and Zimmerman, was equally stupid. The Wall Street Journal touched on both, but I recommend this site for a thorough critique. Truthfully, the number of abstracts and methodologies from this field I have read that are complete garbage is astounding, so I'm not going to try to link them all.

The question ultimately becomes what piece of evidence is required before admitting that climate may not be as sensitive to anthropogenic emissions as once thought?

Outside of the problems with their scientific methodology, there are also some ethics issues that keep cropping up. A statistician working for a left-leaning think tank was just terminated because he wrote a piece about the statistically weak case for anthropogenic warming. About a month before that, Lennart Bengtsson, a climate scientist, tried to join a more conservative and skeptical climate change think tank. He had to resign due to threats, authors withdrawing from his papers, and general concern for his safety and wellbeing.

A paper of his, focused on the discrepancy between models and observations, was rejected, with one reviewer stating that they recommended rejection in part because they felt the paper might be harmful. The reviewer also asserted that climate models should not be validated against observational data. A few years before that, it was Climategate.

A psychology paper tried to paint skeptics as conspiracy nuts. When it was retracted for ethical reasons, the researchers and their community insisted it was perfectly OK to debase your opponents and claimed the retraction was really due to lawsuits. The clamoring defense got so antagonistic that the publisher had to reiterate that the retraction was due to the paper's ethics violations and language, and the failure or unwillingness of the authors to make changes. There's also the paper that says lying about and exaggerating results is OK.

If you read Skeptical Science, they try to rebut skeptic claims, but nine times out of ten they use strawmen, ad hominems, or other logical fallacies. For instance, a good argument can be made that cheap, affordable fossil fuel energy can greatly improve the poorest nations of the world, and that denying them access to this resource is harmful. They rebut it by pointing out that projected climate damages impact the poorest nations the most. That might be true, but it's not the same argument. Depriving impoverished nations of the energy they need to grow, enforcing poverty and mandating foreign dependence for 85 years so that the poor might suffer from fewer storms in the future, is frankly asinine.

But let's say you're not skeptical of the ethics or the methodological flaws. Let's say you decide you want to avert the future risk now. You can still be skeptical of the proposed solutions. For instance, look at energy policy. The cheapest, most effective, and simplest energy policy would be a carbon tax; again, $4-5 per ton accurately prices future damages. It also lets countries and markets work instead of hoping bureaucrats don't screw it up. Essentially, a carbon tax prices carbon at its actual cost instead of handing enormous power to unelected officials, like what the EPA just did.

But maybe you don't believe in markets; maybe you believe the government isn't as incompetent as it seems to keep trying to prove. That's fine, you can still be skeptical of how the money is being spent. In the US, solar receives an unbelievable amount of market favoritism, starting with a 40% tax credit on every installation. Additionally, while all sectors recoup their capital investments over time as the assets depreciate, solar recoups 100% of its cost in under 5 years; that's less time than an office chair. Even with these perks, solar is still the most expensive form of electricity generation.

When you correct for just the tax credit, solar costs increase to almost $140/MWh for standard installations, and more for thermal solar. Correcting for wind's tax credit brings it up to about $88/MWh. Intermittency on the grid is an externality that should also be accounted for, and from there you have to factor in the degradation of wind and solar as a cost, which, integrated over their life spans, multiplies their cost by almost 4.5.

Nuclear, at approximately $96/MWh, is substantially cheaper, has lower life-cycle carbon emissions, lasts longer, and is safer. Yet we only hear about increasing renewable contributions, when renewables are literally worse in every way.
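The direction of the tax-credit correction can be illustrated by backing a subsidy out of a reported cost. This is a deliberate over-simplification (it treats the credit as covering that fraction of the full levelized cost and ignores capacity factors, financing, and depreciation schedules), and the numbers are hypothetical, not the figures cited above:

```python
def subsidy_adjusted_cost(reported_cost, credit_fraction):
    """Back a subsidy out of a reported levelized cost, assuming the credit
    effectively covered `credit_fraction` of the true cost. A deliberate
    over-simplification of real LCOE accounting, for illustration only."""
    return reported_cost / (1 - credit_fraction)

# Hypothetical quoted costs in $/MWh with hypothetical credit fractions:
print(subsidy_adjusted_cost(90, 0.40))  # -> 150.0
print(subsidy_adjusted_cost(70, 0.20))  # -> 87.5
```

The point is simply that a reported price net of a large credit understates the true cost by the factor 1 / (1 - credit), before any intermittency or degradation adjustments.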

These are issues we can have with the scientific authenticity of the theory. The next step would be falsification, but it's difficult to find a piece of falsifying evidence: no matter what happens now or in the future, global warming/climate change seems to predict it. We have both warmer and cooler springs, summers, falls, and winters. There are droughts and floods, cold and hot. Literally, in 2009/2010 we were hearing how climate change will totally cause more snow in winter, when just a few years before it was the end of snowy winters. Hell, that was four years ago, and they were still blaming crappy Olympic conditions on global warming this year, ignoring entirely that the average February temperature for that part of Russia is above freezing throughout the whole damn record.

We have almost two decades of no temperature trend, and a net negative trend for a little over a decade. That's apparently not enough. There are periods within this interglacial that have been warmer, and periods that have been cooler, so what is the reference period of a climatic normal? A few hundred years ago temperature spiked without greenhouse gas emissions; the period of 1914-1940 showed a similar rate and trend as the 1980-2000 period, so why is the latter anthropogenic and the former not? How is CO2 the driving force this time when there is scant to no evidence that it has ever been the major force in the past?

Why should we believe the corrections or explanations for the pause when the same individuals hyping them were the ones pointing out how perfect the models were just a few years ago? There is no mechanism by which heat re-emitted in the lower atmosphere magically descends to the deepest layer of the ocean: the layer we only started measuring a few years ago, for which there still aren't reliable measurements, and which is bounded by warmer upper oceans and geothermal heating. How does the heat get there without being detectable in the upper atmosphere, the lower atmosphere, or the upper ocean? Nor is there a mechanism to describe how it could possibly all concentrate in the Arctic, where we also don't have any measurements.

Where do we draw the scientific line between natural and artificial trends, and how do we know that line is accurate? Why shouldn't climate science be required to validate? What would count as falsifying evidence? Faced with the mountain of problems surrounding uncertainty, poor methodology, awful ethics, and analysis, most skeptics, myself included, just call the whole thing bullshit.
