In the realm of climate science, as in most topics, there exists a range of ideas as to what is going on, and what it means for the future.
At the risk of generalizing, the gamut looks something like this: Ultra-alarmists think that human greenhouse-gas-producing activities will vastly change the face of the planet and make the earth inhospitable for humans; they therefore demand large and immediate action to curtail greenhouse gas emissions.
Alarmists understand that human activities are changing the earth’s climate and think that the potential changes are sufficient to warrant some pre-emptive action to try to mitigate them.
Skeptics think that human activities are changing the earth’s climate but, by and large, they think that the changes are not likely to be terribly disruptive (and could even be, in net, positive) and that drastic action to curtail greenhouse gas emissions is unnecessary, difficult, and ineffective.
Ultra-skeptics think that human greenhouse gas-producing activities are impacting the earth’s climate in no way whatsoever.
Most of my energy tends to be directed at countering alarmist claims about impending climate catastrophe, but the scientist in me gets just as bent out of shape about some of the contentions made by the ultra-skeptics, which are simply unsupported by virtually any scientific evidence. Primary among these claims is that human activities are not responsible for the observed build-up of atmospheric carbon dioxide. This is just plain wrong.
We have good measurements of how much carbon dioxide is building up in the atmosphere each year, and we have good estimates of how much carbon dioxide is being emitted from human activities each year. It turns out that there are more than enough anthropogenic emissions to account for how much the atmosphere is accumulating. In fact, the great mystery concerns the “missing carbon,” that is, where exactly the extra carbon dioxide that is emitted by humans but doesn’t end up staying in the atmosphere is going. (Only about half of the human CO2 emissions end up accumulating in the atmosphere; the rest end up somewhere else—in the oceans, in the terrestrial biosphere, etc.)
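The bookkeeping here is simple enough to sketch. The following back-of-the-envelope budget uses round, illustrative numbers (roughly representative of the late 2000s, not a precise inventory), along with the standard conversion that about 2.13 billion metric tons of carbon (GtC) corresponds to 1 ppm of atmospheric CO2:

```python
# Back-of-the-envelope carbon budget. All figures are illustrative round
# numbers, not precise inventory data.

GTC_PER_PPM = 2.13  # ~2.13 GtC of carbon raises atmospheric CO2 by ~1 ppm

fossil_emissions_gtc = 8.5   # rough annual fossil-fuel + cement emissions, GtC
atmos_growth_ppm = 2.0       # rough observed annual CO2 rise, ppm

# Convert the observed atmospheric rise into carbon mass (~4.3 GtC/yr).
atmos_growth_gtc = atmos_growth_ppm * GTC_PER_PPM

# Roughly half the emitted carbon stays in the air; the rest is the
# "missing carbon" taken up by the oceans and terrestrial biosphere.
airborne_fraction = atmos_growth_gtc / fossil_emissions_gtc
sink_uptake_gtc = fossil_emissions_gtc - atmos_growth_gtc

print(f"accumulating in atmosphere: {atmos_growth_gtc:.1f} GtC/yr")
print(f"airborne fraction:          {airborne_fraction:.0%}")
print(f"taken up by sinks:          {sink_uptake_gtc:.1f} GtC/yr")
```

With these round numbers, emissions comfortably exceed the atmospheric accumulation, which is the point: the puzzle is where the surplus goes, not where the accumulation comes from.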
In my opinion, it would be much more useful for folks interested in the carbon cycle to try to better understand the behavior of the CO2 sinks and how that behavior may change in the future (if at all), rather than trying to come up with sources of CO2 other than human activities to explain the atmospheric concentration growth—as it is, we already have too much CO2 to account for, not too little.
What this means is: The argument that the increase in atmospheric CO2 results from a natural temperature recovery from the depths of the Little Ice Age in the mid-to-late 1800s just doesn’t work.
In fact, all lines of evidence are against it.
This argument has its foundation in the carbon-dioxide and temperature trends of the past 400 to 600 thousand years, which we know from air bubbles trapped in ice cores extracted in Antarctica and Greenland. Basically, the data from the ice cores show that periods when the earth’s climate has been warm are also periods when there have been relatively higher CO2 concentrations (Figure 1).
Al Gore uses this to say that the higher CO2 caused the higher temperatures; ultra-skeptics counter by pointing out that, if you look closely enough, you’ll see that the temperature rises before the CO2 rises, so rising temperatures cause rising CO2, not vice versa.
The fact is that both interpretations are correct—rising temperatures led to rising CO2, which led to more rising temperatures. But the only relevance that this has to the current situation is that this natural positive feedback between temperature variations and CO2 variations didn’t run away in the past, and so we shouldn’t expect it to run away now. It carries no relevance as to what is causing the ongoing increase in atmospheric CO2 concentrations.
But anyone who looks at the data (shown in Figure 1) will see that no matter which caused the other, the changes in temperature from ice-age cold to interglacial warmth are about 10ºC while the change in CO2 is about 100ppm. Since the late 1800s, the temperature has warmed a bit less than 1ºC, while the CO2 concentration has increased by a bit less than 100ppm. In other words, the CO2 rise observed over the past 100 or so years is about 10 times larger than the natural, historical relationship between temperature and CO2 would predict. Thus, there is no way that the temperature rise from the Little Ice Age to the present can be the cause of an atmospheric CO2 increase of nearly 100ppm—the reasonable expectation would be about 10ppm. This line of reasoning is off by an order of magnitude.
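The order-of-magnitude check above can be written out explicitly, using the round figures from the text (a glacial-interglacial swing of roughly 10ºC accompanying roughly 100ppm of CO2, and a modern warming of roughly 1ºC):

```python
# Order-of-magnitude check on the "warming caused the CO2 rise" argument,
# using the round numbers quoted in the text.

glacial_dT_c = 10.0       # ice-core temperature swing, deg C
glacial_dCO2_ppm = 100.0  # accompanying CO2 swing, ppm

modern_dT_c = 1.0           # warming since the late 1800s (a bit less, really)
observed_dCO2_ppm = 100.0   # observed CO2 rise over the same period (a bit less)

# Historical sensitivity: ~10 ppm of CO2 per degree of warming.
ppm_per_degree = glacial_dCO2_ppm / glacial_dT_c

# What the historical relationship would predict for ~1 deg C of warming.
expected_dCO2_ppm = modern_dT_c * ppm_per_degree

# How far off the "natural recovery" explanation is: ~10x.
discrepancy = observed_dCO2_ppm / expected_dCO2_ppm

print(f"expected from warming alone: ~{expected_dCO2_ppm:.0f} ppm")
print(f"observed:                    ~{observed_dCO2_ppm:.0f} ppm ({discrepancy:.0f}x larger)")
```

Even granting generous rounding in either direction, a factor of ten is not a gap that measurement uncertainty can close.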
And where do ultra-skeptics think the CO2 building up in the atmosphere is coming from, if not from humans? Their answer is typically “the oceans”—as the oceans warm, they outgas carbon dioxide. While this is certainly true, an opposite effect is also ongoing—a greater concentration of CO2 in the air drives more CO2 into the oceans. One way of determining how much CO2 is dissolved in the oceans is to observe the pH of the ocean waters. Long-term trends show a gradual decline in ocean pH (the source of the ocean “acidification” scare—the subject of a future MasterResource article). This means that the ocean is gaining more CO2 than it is losing. So, it can’t be the source of the large CO2 increase observed in the atmosphere.
Another way to figure out where the extra CO2 that is now part of the annual flux is coming from is through an isotopic analysis. CO2 that is released from fossil fuels carries a different (lighter) molecular weight than that which is usually part of the annual CO2 flux from land and oceans and atmosphere. CO2 released by fossil fuel has a lower 13C/12C ratio than does most other CO2 and long-term records show that the overall 13C/12C ratio in the atmosphere has been declining—an indication that an increasing proportion of atmospheric CO2 is coming from fossil fuel sources.
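A deliberately crude mass balance illustrates why the ratio must fall. The numbers below are illustrative (a rough preindustrial atmospheric carbon stock and δ13C value, a typical fossil-fuel δ13C, and a rough figure for fossil carbon retained in the air); real-world isotope exchange with the oceans and biosphere damps the signal considerably, so this sketch overstates the decline, but the direction of the effect is the point:

```python
# Two-source isotopic mixing sketch. Values are illustrative, and delta-13C
# is in per-mil relative to the measurement standard. This ignores isotope
# exchange with the oceans and biosphere, which damps the real atmospheric
# signal, so it overstates the decline -- but adding isotopically light
# fossil carbon can only pull the atmospheric 13C/12C ratio down.

atmos_gtc = 600.0         # rough preindustrial atmospheric carbon, GtC
atmos_d13c = -6.5         # rough preindustrial atmospheric delta-13C, per mil

fossil_added_gtc = 200.0  # rough fossil carbon retained in the atmosphere, GtC
fossil_d13c = -28.0       # typical fossil-fuel delta-13C, per mil

# Mass-weighted average of the two carbon pools after mixing.
mixed_d13c = (atmos_gtc * atmos_d13c + fossil_added_gtc * fossil_d13c) / (
    atmos_gtc + fossil_added_gtc
)

print(f"delta-13C after mixing: {mixed_d13c:.1f} per mil (was {atmos_d13c} per mil)")
```

An ocean or volcanic source, whose carbon is isotopically much closer to the atmosphere's, could not produce a decline like this; only a large input of light, fossil-derived carbon can.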
So there are (at least) three independent methods of determining the source of the extra CO2 that is building-up in the atmosphere, and all three of them finger fossil-fuel combustion as the primary culprit.
Yet, despite the overwhelming scientific evidence, the ultra-skeptics persist in forwarding the notion that the observed atmospheric CO2 growth is not caused by human actions. And sadly, since this notion is extremely pleasing to those folks (politicians et al.) who are actively fighting legislation aimed at limiting greenhouse gas emissions, it is widely incorporated into their stump speeches. Some even go so far as to suggest that the respiration of 6.5 (and growing) billion humans plays a role in the CO2 increases—again, pure nonsense. We humans breathe out only what we take in, and since we eat plants, which extract atmospheric CO2 for their carbon source in producing carbohydrates, it is a completely closed loop. (Now if we ate coal, or drank oil, perhaps things would be different.)
Fighting bad science with bad science is just a bad idea. There are numerous reasons to oppose restrictions on greenhouse gas emissions, but the notion that they aren’t contributing to increasing atmospheric concentrations of carbon dioxide isn’t one of them.
In future articles, if I have time between combating alarmist outbreaks, I may point out some other ultra-skeptic fallacies—such as, “The build-up of atmospheric greenhouse gases isn’t responsible for elevating global average surface temperatures” or “Natural variations can fully explain the observed ‘global warming’.”