I’ve been having an e-mail discussion with another researcher who publishes on the subject of climate feedbacks, and who remains unconvinced of my ideas regarding the ability of clouds to cause climate change. Since I am using the simple forcing-feedback model as evidence of my claims, I thought I would show some model results for a 1,000 year integration period.

What I want to demonstrate is one of the issues that is almost totally forgotten in the global warming debate: long-term climate changes can be caused by short-term random cloud variations.

The main reason this counter-intuitive mechanism is possible is that the ocean's large heat capacity retains a memory of past temperature changes, so the system exhibits random-walk-like behavior. It is not a true random walk, because temperature excursions from the average climate state are partly constrained by the temperature-dependent emission of infrared radiation to space.
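The post doesn't include code, but the distinction it draws can be sketched numerically: a pure random walk wanders without bound, while adding a restoring term (standing in for the extra infrared loss to space when the system is warm) keeps the excursions bounded. All numbers below are illustrative choices of mine, not the author's.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 12000                          # 1,000 years of monthly steps
noise = rng.normal(0.0, 1.0, n)    # random monthly forcing

# Pure random walk: every step accumulates forever.
walk = np.cumsum(noise)

# Damped walk: a restoring term pulls each step back toward the
# mean state, bounding the excursions (an AR(1) process).
k = 0.01                           # illustrative damping per step
damped = np.zeros(n)
for i in range(1, n):
    damped[i] = damped[i - 1] * (1.0 - k) + noise[i]
```

The damped series stays far closer to the mean state than the unconstrained walk, which is the sense in which the ocean's behavior is "random-walk like" but not a true random walk.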

A 1,000 Year Model Run

The temperature variability in this model experiment is entirely driven by a 1,000 year time series of monthly random numbers, which is then smoothed with a 30-year filter to mimic multi-decadal variability in cloud cover.
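A forcing series of that kind can be sketched as monthly Gaussian noise passed through a 30-year (360-month) running mean. The filter shape and noise amplitude are my assumptions; the post specifies only "monthly random numbers" and a "30-year filter".

```python
import numpy as np

rng = np.random.default_rng(0)
months = 12000                       # 1,000 years of months
raw = rng.normal(0.0, 1.0, months)   # monthly random cloud variations

# 30-year (360-month) running mean to mimic multi-decadal
# variability in cloud cover (filter choice is an assumption).
window = 360
kernel = np.ones(window) / window
forcing = np.convolve(raw, kernel, mode="same")
```

Smoothing leaves only the slow variations, with a much smaller amplitude than the raw monthly noise.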

I’ve run the model with a 700 m deep ocean and a strong negative feedback: 6 Watts per sq. meter of extra energy loss to space per degree of warming, which is equivalent to only 0.5 deg. C of warming for a doubling of atmospheric CO2. This is the feedback we observed in satellite data for month-to-month global average temperature variations.
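A minimal sketch of such a forcing-feedback integration, assuming the standard mixed-layer energy balance Cp·dT/dt = F(t) − λ·T with the stated 700 m depth and λ = 6 W/m²/K. The forcing amplitude is my guess, chosen so the smoothed forcing is of order 1 W/m²; the actual model details are not given in the post.

```python
import numpy as np

rng = np.random.default_rng(1)
months = 12000
dt = 86400.0 * 30.4375               # seconds per month
cp = 1025.0 * 3990.0 * 700.0         # heat capacity of a 700 m ocean column, J/m^2/K
lam = 6.0                            # net feedback parameter, W/m^2/K

# 30-year-smoothed random cloud forcing; the raw amplitude (20 W/m^2)
# is an assumption that yields ~1 W/m^2 after smoothing.
raw = rng.normal(0.0, 20.0, months)
forcing = np.convolve(raw, np.ones(360) / 360, mode="same")

# Energy balance:  cp * dT/dt = F(t) - lam * T
T = np.zeros(months)
for i in range(1, months):
    T[i] = T[i - 1] + dt * (forcing[i] - lam * T[i - 1]) / cp

# Top-of-atmosphere radiative imbalance: forcing plus feedback response
imbalance = forcing - lam * T
```

With λ = 6 W/m²/K and a 700 m column, the relaxation time Cp/λ works out to roughly 15 years, so the temperature response is much smoother and slower than the forcing itself.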

The first plot below shows the resulting global average radiative imbalance, which is a combination of (1) the random cloud forcing and (2) the radiative feedback upon any temperature change from that forcing. Note that the standard deviation of these variations over the 1,000 year model integration is only one-half of one percent of the average rate at which solar energy is absorbed by the Earth, which is about 240 Watts per sq. meter.

[Plot: global average radiative imbalance over the 1,000 year model integration]

I also computed the average 10-year trends for all 10-year periods contained in the 1,000 year time series shown above, and got about the same value as NASA’s best radiation budget instrument (CERES) has observed from the Terra satellite for the ten-year period 2000 – 2010: about 1 Watt per sq. meter per decade. Thus, we have satellite evidence that the radiative imbalances seen above are not unrealistic.
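The decadal-trend calculation can be sketched by fitting a line to every overlapping 120-month window of the imbalance series and averaging the magnitudes of the slopes. The forcing construction below reuses my illustrative assumptions from above, not the author's actual series.

```python
import numpy as np

rng = np.random.default_rng(3)
months = 12000
imbalance = np.convolve(rng.normal(0.0, 20.0, months),
                        np.ones(360) / 360, mode="same")

# Slope of every overlapping 10-year (120-month) window,
# expressed in W/m^2 per decade.
win = 120
t = np.arange(win) / 120.0           # time axis in decades
trends = [np.polyfit(t, imbalance[i:i + win], 1)[0]
          for i in range(months - win + 1)]
mean_abs_trend = np.mean(np.abs(trends))
```

The post reports that this average is about 1 W/m² per decade, comparable to what CERES observed over 2000–2010; the value here depends on my assumed forcing amplitude.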

The second plot shows the resulting temperature changes over the 1,000 year model run. Note that even though the time scale of the forcing is relatively short (30-year-smoothed monthly random numbers), the 700 m ocean layer can experience temperature changes on much longer time scales.

[Plot: global average temperature over the 1,000 year model run]

In fact, if we think of this as the real temperature history for the last 1,000 years, we might even imagine a “Medieval Warm Period” 600 years before the end of the integration, with rapid global warming commencing in the last century.

Hmmm…sounds vaguely familiar.

The main point here is that random cloud variations in the climate system can cause climate change. You don’t need a change in solar irradiance, or any other external forcing mechanism.

The above plots also illustrate the danger in comparing things like sunspot activity (and its presumed modulation of cloud cover) to long-term temperature changes. As you can see, the temperature variations in the second plot look nothing like the global energy imbalance variations in the first plot. This is for two reasons: (1) forcing (a global radiative imbalance) due to cloud variations is related to the time rate of change of temperature, not to the temperature itself; and (2) the ocean’s “memory” of previous forcing leads to temperature behavior on much longer time scales than the short-term cloud forcing alone would suggest.
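Point (1) follows directly from the energy balance Cp·dT/dt = F − λT: the forcing appears on the same side as the rate of change, not the temperature. A quick check with unsmoothed monthly forcing (where the effect is easiest to see; parameters are my illustrative choices) confirms that the forcing correlates strongly with dT/dt and only weakly with T:

```python
import numpy as np

rng = np.random.default_rng(7)
months = 12000
dt = 86400.0 * 30.4375
cp = 1025.0 * 3990.0 * 700.0         # 700 m ocean column, J/m^2/K
lam = 6.0                            # W/m^2/K

# Unsmoothed monthly forcing makes the relationship clearest.
forcing = rng.normal(0.0, 2.0, months)
T = np.zeros(months)
for i in range(1, months):
    T[i] = T[i - 1] + dt * (forcing[i] - lam * T[i - 1]) / cp

dT = np.diff(T)                      # month-to-month temperature change

r_rate = np.corrcoef(forcing[1:], dT)[0, 1]     # forcing vs. rate of change
r_temp = np.corrcoef(forcing[1:], T[1:])[0, 1]  # forcing vs. temperature
```

In this sketch r_rate is close to 1 while r_temp is small, which is why regressing a forcing proxy (like sunspots) directly against temperature can be so misleading.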

The fact that climate change can be caused by seemingly random, short-term processes has been totally lost in the climate debate. I’m not sure why. Could it be that, if we were to admit the climate system can vary in unpredictable ways, there would be less room for our egos to cause climate change?



