I’ve grown accustomed to it over the years. I’ve never been fully convinced that Global Warming was really happening, and I’ve never concealed my opinions to that effect. As a result, I’ve been called any number of derogatory epithets, leading off with “denier,” a throwback to the Germans (and a few others) who steadfastly refused to believe that the Third Reich was systematically eliminating Jews, gays, and other human beings they considered undesirable. But questioning GW is markedly different from denying the Holocaust; there are solid, massive amounts of evidence, including eyewitness accounts, to document Hitler’s Final Solution.
Global Warming, on the other hand, is supported by far less real, tangible evidence, and much of the theory hangs on the predictions of computer models, which in most sciences are a useful tool but which, because they can and do tend to be imprecise, are trusted only on a theoretical level. Indeed, one such model, Michael Mann’s infamous “Hockey Stick” graph, proved to be so distanced from reality as to become a worldwide laughingstock. And, of course, we GW deniers aren’t denying (and thus excusing) genocide. As I said, there’s a difference.
But the real difference lies in the reality of what has been transpiring in Earth’s atmosphere. For some 15 years now, nearly two decades, the atmosphere has not warmed, even though the amount of carbon dioxide released into it keeps increasing; an estimated 100 billion tons of CO2 were added to the atmosphere between 2000 and 2010.
Now, before all you true believers rise en masse to crucify me, let me assure you that it is not my intention to flat-out deny the possibility of GW. Just as I believe there is not enough good evidence to confirm its existence beyond the shadow of a doubt, so too, one cannot at this point say, “That’s it, folks, move on; nothing to see here,” like a cop at an accident scene. What does seem to be happening, however, is that there is now, on the part of some scientists, a modicum of doubt, at least as to the potential severity of the GW phenomenon; a doubt that, for many of these scientists and government officials, did not exist until now. Concurrent with the emergence of that doubt, some of the scientists themselves are taking a second look at their computer models and other data, seeking explanations for what are, at the moment, puzzling anomalies that are not congruent with much of the thinking to date. Even NASA’s James Hansen, one of the chief standard-bearers for Global Warming theory, has observed publicly, “The five-year mean global temperature has been flat for a decade.”
The question of what is actually going on centers on what scientists term “climate sensitivity,” which, as its name implies, simply refers to how much (or how little) the climate will react to changes in CO2 levels over time. According to the UK’s venerable conservative (in the American sense) journal, The Economist,
This is usually defined as how much hotter the Earth will get for each doubling of CO2 concentrations. So-called equilibrium sensitivity, the commonest measure, refers to the temperature rise after allowing all feedback mechanisms to work (but without accounting for changes in vegetation and ice sheets).
The rule of thumb for the effect that CO2 has on the atmosphere’s temperature has been that each doubling of the amount of carbon dioxide in the atmosphere will result in roughly a 1 degree Celsius rise in its temperature. There are, however, other variables that complicate this picture. As The Economist notes, these variables complicate predictions for two reasons:
One is that rising CO2 levels directly influence phenomena such as the amount of water vapour (also a greenhouse gas) and clouds that amplify or diminish the temperature rise. This affects equilibrium sensitivity directly, meaning doubling carbon concentrations would produce more than a 1°C rise in temperature. The second is that other things, such as adding soot and other aerosols to the atmosphere, add to or subtract from the effect of CO2. All serious climate scientists agree on these two lines of reasoning. But they disagree on the size of the change that is predicted.
It’s worth noting that the Intergovernmental Panel on Climate Change (IPCC), acknowledged as the orthodox expert on the subject, has put its best estimate for that sensitivity at 3 degrees Celsius.
And now, as a result of the flatlining of global warming over the past decade-plus, scientists have returned to their data, looking for explanations for this anomalous behavior on the part of the atmosphere, and new predictions have emerged, which vary significantly. A team of researchers at the University of Oslo has proposed that a doubling of CO2 will produce an increase in temperature of only 1.2-2.9 degrees, with a likely figure of 1.9 degrees. The group ascribes a 90 percent probability to that figure, but their work has not yet been peer reviewed. A scientist by the name of Julia Hargreaves at the Research Institute for Global Change in Yokohama, Japan, offers a range of 0.5-4.0°C, with a mean of 2.3°C.
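All of these estimates describe the same simple logarithmic relationship: warming equals the sensitivity (degrees per doubling of CO2) times the number of doublings. A minimal sketch of that arithmetic is below; note that the 280 ppm and 560 ppm concentrations are illustrative round numbers of my own (pre-industrial CO2 and its doubling), not figures quoted in the sources above.

```python
import math

def warming(sensitivity_c, co2_start_ppm, co2_end_ppm):
    """Temperature rise implied by a change in CO2 concentration,
    given a climate sensitivity in degrees C per doubling:
    dT = sensitivity * log2(C_end / C_start)."""
    return sensitivity_c * math.log2(co2_end_ppm / co2_start_ppm)

# Sensitivity estimates cited above (degrees C per doubling of CO2)
estimates = {
    "IPCC best estimate": 3.0,
    "Oslo team (likely)": 1.9,
    "Hargreaves (mean)": 2.3,
}

# Warming implied by one full doubling, 280 ppm -> 560 ppm
for label, s in estimates.items():
    print(f"{label}: {warming(s, 280, 560):.1f} C per doubling")
```

For a full doubling the answer is just the sensitivity itself; the point of the logarithm is that a partial increase (say 280 to 400 ppm) yields proportionally less than one doubling’s worth of warming.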
Another factor complicating the accuracy of scientists’ predictions is the computer models themselves. The models have always fed the skepticism of doubters, usually on the grounds that a model is only as good as the data put into it (known in programmer circles as GIGO: Garbage In, Garbage Out), and they are now so numerous that they are divided into two categories. The first, known as General Circulation Models (GCMs), takes a bottom-up approach, relying on enormous amounts of data to predict changes. Because they are built on terabytes of data, their proponents point to the complexity and comprehensiveness of these models as a strength. However, models of this type do not work with real-time numbers; they project long-term activity through simulations rather than real figures.
The second category of computer models, known as energy balance models, is simpler and works from a top-down framework. By virtue of its basic setup, this group is over-simplified and thus not believed to be as accurate as the GCMs. For obvious reasons, scientists are searching anew for other data and sources to supplant the data-rich but largely hypothetical computer models. Other problems that could contribute to the apparent discrepancy between predictions and reality include the presence of aerosols (including soot) in the atmosphere. Though aerosols are universally believed to be harmful, the ability to measure them accurately, and to base predictions on those measurements, is only just reaching the necessary accuracy levels.
There are a number of theories being postulated for why the model predictions have been so far off target; some blame a lack of in-depth knowledge of the effects various elements have on the atmosphere, while others blame the technology, noting that we cannot yet measure ocean temperatures to the precision necessary for accurate work, especially at extreme depths. Even natural (as opposed to man-made) factors are now being considered as having a greater effect on the atmosphere than previously thought.
One result of the lack of recent warming can only be good: it appears to be encouraging a much wider-ranging dialogue, not only among scientists but between governments and legislative bodies as well. As more comprehensive data is collected and presented, we will all benefit from the results.
It’s also worth noting that, as Canada’s Financial Post points out, “According to a Pew report released earlier this month, among Americans global warming ranks last among 21 public policy priorities that the government should deal with. European polls show similar results.”
And who knows? We “deniers” may turn out to be right after all.