
The $100,000 Giant Climate Fluctuation

[Image: Industrial pollution nature disaster concept, double exposure. Photo: TanawatPontchour via Getty Images]

While the scientific community has increasingly implicated human actions -- especially greenhouse gas emissions -- in the industrial epoch warming, skeptics have continued to denigrate the numerical climate models (GCMs) and complain that the data are biased. A typical skeptic explanation adorns a "Friends of Science" billboard: "The sun is the main driver of climate change. Not you. Not CO2." The billboard exhorts us to believe that global warming is simply a (solar-induced) giant natural fluctuation (GNF).

Until last November, skeptics had only invoked GNFs: they refrained from proposing concrete GNF models, making their position difficult to criticize. That changed with the publication, by British statistician and skeptic Doug Keenan, of a sample of 1,000 series from a "trendless statistical model, which was fit to a series of global temperatures."

Following Keenan's 2014 presentation to the British House of Lords, the model outputs were proffered as part of a statistical brainteaser with a $100,000 prize, purportedly demonstrating "... that virtually all claims to have drawn statistical inferences from climatic data are untenable. In particular, there is no demonstrated observational evidence for significant global warming."

Now, in Geophysical Research Letters (July, 2016), our research group at McGill University has published a detailed analysis of Keenan's model showing that while it may indeed be compatible with the globally and annually averaged industrial temperatures since 1880, it is quite incompatible with our extensive knowledge of preindustrial temperatures.

In particular, the GNF theory falls apart when tested against multiproxy temperatures spanning the four centuries from 1500 to 1900. Multiproxy temperatures are estimated from data obtained from thousands of tree rings, boreholes, ice cores, lake sediments, pollen and other indicators of past temperatures.

Our analysis shows that the variability of Keenan's model over periods of centuries is so strong that even mild extrapolations imply that the earth would go in and out of an ice age roughly every 1,000 years instead of every 100,000. What's more, while the typical global-scale, preindustrial century-to-century temperature change is about 0.20 degrees Celsius, in Keenan's model it is three to five times larger (depending on which variant of the model is considered). Finally -- for those interested in the money -- a supplement to the paper shows how to (nearly) win the contest.
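To see why strong correlations blow up century-scale variability, here is a minimal sketch in Python (using only numpy). It is not Keenan's actual model: it substitutes a driftless random walk, an assumed stand-in that shares the key "integrated," strongly correlated behaviour, with an annual step of 0.1 degrees Celsius chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a strongly correlated ("integrated") GNF model:
# a driftless random walk. The 0.1 C annual step is an assumed,
# purely illustrative value, not a fitted parameter.
n_years = 100_000
T = np.cumsum(rng.normal(0.0, 0.1, n_years))

# Typical change between consecutive, non-overlapping 100-year means.
century_means = T.reshape(-1, 100).mean(axis=1)
typical_change = np.mean(np.abs(np.diff(century_means)))
print(f"typical century-to-century change: {typical_change:.2f} C")

# A random walk wanders without bound, so over long spans the model
# drifts by many degrees C -- enough to enter and exit ice-age
# conditions far more often than the observed ~100,000-year cycle.
print(f"overall range: {T.max() - T.min():.1f} C")
```

Even this crude stand-in typically yields century-to-century changes several times larger than the observed 0.20 degrees Celsius, and multi-degree excursions on millennial time scales.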

Ironically, even at its inception Keenan's contest was superfluous: in 2014 our group at McGill had already published an analysis showing that the probability of the industrial epoch warming being a GNF was less than 0.1 per cent. The 2014 analysis even generously took into account the possibility of extreme "black swan" temperature fluctuations that had never before been considered; in comparison, conventional "bell curve" statistical assumptions yield probabilities of less than one in a million.
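The gap between the two bounds comes down to the assumed shape of the tails of the fluctuation distribution. As a purely illustrative sketch -- the z-score and the Student-t heavy-tailed alternative below are assumptions for demonstration, not the paper's actual fluctuation statistics -- compare the tail probability of the same-sized warming under a bell curve and under a fat-tailed law:

```python
from scipy import stats

# Suppose, for illustration only, that the industrial-epoch warming sits
# z = 5 standard deviations above typical natural century-scale
# variability (an assumed value, not the published estimate).
z = 5.0

p_bell = stats.norm.sf(z)       # conventional "bell curve" tail
p_fat = stats.t.sf(z, df=3)     # a heavy-tailed "black swan" alternative
print(f"bell curve: {p_bell:.1e}   heavy-tailed: {p_fat:.1e}")
# The bell-curve probability is below one in a million, while the
# heavy-tailed assumption inflates it by several orders of magnitude --
# which is why allowing for black swans is the "generous" choice.
```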

Keenan aimed to publicize his criticism of a specific uncertainty analysis assumption adopted by the Intergovernmental Panel on Climate Change (IPCC) in its Fifth Assessment Report (2013). The technical issue is whether one assumes that the residuals (what's left over after a trend has been removed from the data) have weak or strong correlations.

The IPCC adopted the conventional weak-correlation assumption, whereas Keenan's model involves strong correlations, and the resulting trends (and their uncertainties) do indeed depend somewhat on this. However, the GRL paper shows that while the weak-versus-strong issue may be scientifically important, the consequences of these assumptions for the statistical analysis of anthropogenic warming are relatively marginal.
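To make the weak-versus-strong distinction concrete, the following sketch fits straight-line trends to many trendless realizations of each kind of residual and compares the spread of the spurious trends that result. The choices here are assumptions for illustration: an AR(1) process stands in for the weakly correlated case, a driftless random walk for the strongly correlated one, and the series length and noise parameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 136, 2000  # roughly the 1880-2016 annual record length (assumed)
t = np.arange(n)

def fitted_slope(y):
    return np.polyfit(t, y, 1)[0]  # least-squares trend, degrees C per year

# Weak correlations: AR(1) residuals with modest memory (phi assumed).
phi, sigma = 0.5, 0.1
innov = rng.normal(0.0, sigma, (trials, n))
weak = np.zeros((trials, n))
for i in range(1, n):
    weak[:, i] = phi * weak[:, i - 1] + innov[:, i]

# Strong correlations: driftless random walks (integrated noise).
strong = np.cumsum(rng.normal(0.0, sigma, (trials, n)), axis=1)

for name, series in [("weak (AR1)", weak), ("strong (random walk)", strong)]:
    slopes = np.array([fitted_slope(y) for y in series])
    print(f"{name:22s} spurious-trend spread: {slopes.std():.4f} C/yr")
```

The strongly correlated case produces a far wider spread of spurious trends, which is why the weak-versus-strong choice changes the stated uncertainty on the observed warming trend -- though, as the GRL paper argues, not by enough to rescue the GNF hypothesis.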

With the help of statistics, the GNF model can easily be rejected on scientific grounds.

This blog piece has also been published in French on HuffPost Québec.
