A Free-Market Energy Blog

Climate Alarmism on the Hot Seat: Eric Berger, Houston Chronicle Science Writer, Wants to Know What’s Up

By Robert Bradley Jr. -- September 7, 2009

“For a long time now, science reporters have been confidently told the science is settled…. But I am confused [by recent developments]. Four years ago this all seemed like a fait accompli. Humans were unquestionably warming the climate and changing the planet forever through their emissions of carbon dioxide.”

– Eric Berger, Science Writer, Houston Chronicle, September 6, 2009 [SciGuy Blog]

In his post at MasterResource last week, Ken Green spoke of a potential “death spiral” for climate alarmism, in that the failure of the political process would make it less politically incorrect to challenge climate alarmism. “As hopes for a Gore-style ‘wrenching transformation’ fade,” wrote Green, “more mainstream scientists and opinion-makers will become more ‘practical’ toward the issue, meaning that alarmism may give way to sensible assessments of mitigation, adaptation, and geo-engineering.”

But the other problem for climate alarmism is nonalarmist data, as well as new studies by top climatologists questioning the guts of high-sensitivity climate models. Chip Knappenberger summarized a new study by Richard Lindzen that concluded that the “best guess” warming from the Intergovernmental Panel on Climate Change (IPCC) was radically overstated. Marlo Lewis’s summary, Is the Climate Science Debate Over? No, It’s Just Getting Very, Very Interesting (with welcome news for mankind), also lays out the latest from the quite unsettled–and nonalarmist–science. Are the Malthusians wrong again?

Enter Eric Berger, the open-minded, fair-minded science writer for the Houston Chronicle. With just a little courage, and no doubt a good deal of perplexity, he is asking the question that some have been asking for a long, long time: what is really going on here? No doubt he will take some heat for his post, and no doubt he will get to the bottom of what is going on.

Jerry North (Texas A&M) Hints at the Problem

Eleven years ago, when I was director of public policy at Enron, I entered into a consulting agreement with Gerald North, Distinguished Professor of Atmospheric Sciences and Oceanography at Texas A&M University, to tell me what was going on. North was as close as I could find to a ‘middle of the roader’ between climate alarmism and (ultra) skepticism. He is also highly decorated.

And this has not changed. North’s own intuitive estimate of climate sensitivity is now 50% below the IPCC’s best guess, and he has been critical of a number of the climate mini-alarms that would make headlines and then fade away (more hurricanes, disruption of the thermohaline circulation, etc.).

But I noticed a Malthusian streak in North: the unstated assumption that nature is optimal, and that the human influence on climate cannot be good but only bad–and maybe even catastrophic. Still, North in his emails to me–then and now–was rather blunt about the shortcomings of climate modeling.

Here is a sampling of quotations over the last decade:

“There is no doubt a small ‘sociological convergence’ effect, that tends to work here (individuals and their managers hate to be the outlier). The biggest problem is that doubling CO2 leads to a 1 deg C warming (I think even Lindzen agrees). If water vapor doubles it, we are at 2.0 (Lindzen differs here, but I do not know of anyone else). Are there any other feedbacks? It is hard to dismiss ice feedback, but it might be small. Clouds are positive in most models — I have always taken them to be neutral, but with no substantial reason (it’s just easier that way).”

“I do not think there is enough thinking going on. Just plugging in the numbers or running the simulations. Dick [Lindzen] is clearly right on this one.”

“I believe the ocean simulations are very primitive and quite variable from one group to another. The underlying reason is this: How much of the deep layers of the ocean are really participating in the warming?”

“There are pitifully few ways to test climate models.”

“[Models] sort of fake it (we call it ‘parameterization’). They do it in very crude ways such as if the temperature profile of the atmosphere is unstable, they make the whole column overturn, etc.”

“[The models’ treatment of feedbacks] could also be sociological: getting the socially acceptable answer.”

“I go back to my old position: we need more time, maybe a decade to get a better grip on aerosols, water vapor feedback, cloud feedback, ocean participation.”

“We have only a very loose grip on aerosols.”

“[The models] treat the ocean differently. Somehow, they are fudging the parameters that govern ocean coupling so that they get the right ocean delay to agree with the data in spite of their differing sensitivities.”
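North’s back-of-the-envelope arithmetic in the first quotation above (roughly 1°C of no-feedback warming from doubled CO2, doubled again if water vapor feedback holds) can be sketched with the standard linear feedback-gain relation. This is an illustrative sketch of that textbook relation, not North’s own calculation; the feedback fractions below are assumed values chosen to reproduce his numbers:

```python
def equilibrium_warming(delta_t_no_feedback, feedback_fraction):
    """Standard linear feedback gain: dT = dT0 / (1 - f).

    feedback_fraction f lumps water vapor, ice, clouds, etc.;
    f = 0 means no net feedback, and f approaching 1 means
    runaway amplification.
    """
    if feedback_fraction >= 1.0:
        raise ValueError("f >= 1 implies an unstable (runaway) climate")
    return delta_t_no_feedback / (1.0 - feedback_fraction)

# No-feedback doubling of CO2: ~1.0 deg C (North's starting point)
base = 1.0
print(equilibrium_warming(base, 0.0))      # 1.0 deg C: no feedbacks
print(equilibrium_warming(base, 0.5))      # 2.0 deg C: water vapor doubles it
print(equilibrium_warming(base, 2.0 / 3))  # ~3 deg C: near the IPCC best guess
```

The disagreement between Lindzen and the modeling mainstream is, in this framing, entirely a disagreement about the size and sign of f.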

But before you call North a radical or a tattletale on the ‘consensus’, consider what the IPCC said in the back of its latest assessment of the physical science of climate change:

“The set of available models may share fundamental inadequacies, the effects of which cannot be quantified.”

 – IPCC, Climate Change 2007: The Physical Science Basis (Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change). Cambridge, UK: Cambridge University Press, 2007, p. 805.

Is this a trick? Satisfy the science by stating the science–but do so on page 805 rather than in the executive summary where it belongs. It is this sort of thing that Eric Berger–and other open-minded middle-of-the-roaders–are going to discover. And they just might feel a little duped.

A ‘Skeptic’ Climate Model?

But if high-sensitivity models are errant, why isn’t there a “Lindzen” model? As it has been explained to me, the microphysics of clouds and other key parameters are ‘sub-grid scale’ and beyond the capability of the models. So models are inherently high-sensitivity, and thus alarmist. Something, in this case, is worse than nothing.

What Eric Berger will find out, I believe, is that the ‘easy’ computable answer for climate models introduced an inherently biased, upward estimate of climate sensitivity. But the climate is much more complex than the (artificial) models, and more realistic physics (a la Lindzen) suggests the climate has weaker positive (and maybe neutral or negative) feedbacks. This would mean that CO2 is a benign trace gas–not a harbinger of doom.

And given the political impasse (consumers like reliable, affordable energy, thank you), this is good news for mankind, indeed!

How wide will the rethink–and the confessions–be? Can Richard Kerr at Science help us here? How about other science writers at the New York Times or Wall Street Journal? Will the Society of Environmental Journalists sponsor climate debates between high-sensitivity ‘alarmists’ and low-sensitivity ‘skeptics’? Will alarmists agree to debate at the next Heartland Institute climate conference scheduled in Chicago in May?

What is needed now more than ever is a “challenge culture”–even if it involves a bit of political incorrectness. This is physical science, after all, not political science.

Appendix A: “Climate Scientists Should Talk About What ‘May’ Happen, Rather Than What ‘Will’ Happen” by Eric Berger

I’m the science reporter for the Houston Chronicle, the daily newspaper in the petrochemical capital of the United States, if not the world. I’ve been called a global warming skeptic by environmentalists, and I’ve been called an environmentalist toady by the skeptics.

I’m neither of these things. Rather, I’m just trying to grasp what is happening to the planet’s climate, and how humans are impacting it.

For a long time now, science reporters have been confidently told the science is settled. That the planet is warming and humans are unquestionably the primary cause. We’ve been told to trust the computer models — the models which show a markedly upward trend in temperatures as carbon dioxide concentrations increase. And I’ve trusted the scientists telling me this.

Below you’ll find the computer model forecasts for the 21st century temperatures from the most recent IPCC summary for policymakers, which call for a 1.8°C to 3.8°C rise in global temperatures by 2100:

[Figure: IPCC AR4 computer model projections of 21st-century global temperatures. Source: IPCC]

It seems pretty clear that the models forecast a steady upward trend in global temperatures as long as carbon dioxide levels rise. (Which they have). Yet according to satellite and surface temperature measurements the global average temperature has essentially remained flat for the last 12 years. This strikes me as somewhat curious.

When An Inconvenient Truth came out I believed the movie to be scientifically accurate. Carbon dioxide levels were rising and so were temperatures. And hurricane activity, especially after the disastrous 2005 season, was out of control.

But a funny thing happened on the way to the end of the world: hurricane activity on the global scale is near historical lows. And the Earth seems to have, at least temporarily, stopped warming.

This, despite the fact that some of the country’s leading climate scientists say there is unequivocally a link between major hurricanes and climate change. And despite the fact that other leading climate scientists predicted 2009 or 2010 will go down as the warmest year in recorded history. Either prediction, if true, would be alarming.

Yet both of these predictions seem, at the present moment, to be off.

Then there’s this: a revealing story from an international meeting of climate scientists where a German climate scientist says the world may cool for the next decade or two. New Scientist reports:

One of the world’s top climate modelers said Thursday we could be about to enter “one or even two decades during which temperatures cool.”

“People will say this is global warming disappearing,” he told more than 1500 of the world’s top climate scientists gathering in Geneva at the UN’s World Climate Conference.

“I am not one of the skeptics,” insisted Mojib Latif of the Leibniz Institute of Marine Sciences at Kiel University, Germany. “However, we have to ask the nasty questions ourselves or other people will do it.”

Few climate scientists go as far as Latif, an author for the Intergovernmental Panel on Climate Change. But more and more agree that the short-term prognosis for climate change is much less certain than once thought.

If we can’t have confidence in the short-term prognosis for climate change, how can we have full confidence in the long-term prognosis?

The article is significant for a couple of reasons. First of all it’s written by Fred Pearce, who has a history of forceful journalism outlining climate change’s perils, and it’s published by New Scientist, which has long advocated vigorous action to curb climate change. I respect both the author and the publication.

Secondly, the key point here is that scientists are acknowledging that natural variations are playing a very important role in our present and future climate, perhaps cooling it. Therefore it stands to reason that natural variations might also have played a role in the temperature run-up of the 20th century.

Do not misunderstand me. I am not a climate change skeptic. I do not deny that the planet warmed 0.6°C in the 20th century. I do not deny that humans played some part in that significant warming.

But I am confused. Four years ago this all seemed like a fait accompli. Humans were unquestionably warming the climate and changing the planet forever through their emissions of carbon dioxide.

The problem is that some climate scientists and environmentalists have been so determined to see something done about carbon dioxide emissions — now — that they have glossed over the uncertainties.

Uncertainties like: maybe there isn’t a linear relationship between carbon dioxide and temperature, and maybe the planet will cool for a couple of decades even as carbon dioxide emissions accelerate.

For the last few years some scientists and environmentalists have been telling us a lot about what “will” happen in the future if carbon dioxide emissions continue unabated. It perhaps would have been a lot better if they talked about what “may” happen.

12 Comments


  1. Andrew  

    “maybe there isn’t a linear relationship between carbon dioxide and temperature”

    “Maybe”? Good god, the TAR gave a formula for the forcing:

    ΔF = α ln(C/C₀)

    where α ≈ 5.35 W/m², C is the CO2 concentration, and C₀ is the baseline concentration.

    That’s not an uncertainty; that’s a rock-solid physical relationship. How can it be that Eric doesn’t know that everyone agrees the relationship is not linear?
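    The TAR’s logarithmic forcing expression is easy to check numerically: with α ≈ 5.35 W/m², a doubling of CO2 yields the familiar ~3.7 W/m² of forcing regardless of the starting concentration. A quick illustrative check (the concentrations below are assumed round numbers, not from the comment):

```python
import math

ALPHA = 5.35  # W/m^2, the TAR coefficient for CO2 radiative forcing

def co2_forcing(c, c0=280.0):
    """Radiative forcing (W/m^2) from raising CO2 from c0 to c ppm:
    delta_F = alpha * ln(C / C0) -- logarithmic, not linear."""
    return ALPHA * math.log(c / c0)

print(co2_forcing(560.0))  # doubling from preindustrial: ~3.7 W/m^2
print(co2_forcing(385.0))  # roughly the 2009 level: well under half of that
```

    Because the relationship is logarithmic, each successive increment of CO2 adds less forcing than the one before, which is exactly the point the comment is making.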


  2. Andrew  

    “It seems pretty clear that the models forecast a steady upward trend in global temperatures as long as carbon dioxide levels rise. (Which they have). Yet according to satellite and surface temperature measurements the global average temperature has essentially remained flat for the last 12 years. This strikes me as somewhat curious.”

    I’m in the middle of examining this. It seems that at least a handful of models don’t rule out twelve year periods without warming:
    http://devoidofnulls.wordpress.com/2009/08/31/144-month-trends-in-models/

    However it certainly seems like most models aren’t compatible with that.
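    The kind of check described here, fitting least-squares trends to 144-month windows, can be sketched as follows. The series below is synthetic (an assumed 0.2°C/decade trend plus noise), standing in for model output or observed anomalies:

```python
import numpy as np

def monthly_trend(anomalies):
    """Least-squares linear trend of a monthly anomaly series,
    returned in degrees C per decade."""
    months = np.arange(len(anomalies))
    slope_per_month = np.polyfit(months, anomalies, 1)[0]
    return slope_per_month * 120.0  # 120 months per decade

# Synthetic 144-month series: 0.2 C/decade trend buried in noise.
rng = np.random.default_rng(0)
months = np.arange(144)
series = (0.2 / 120.0) * months + rng.normal(0.0, 0.1, size=144)
print(round(monthly_trend(series), 3))  # close to 0.2; scatter comes from the noise
```

    Running this over every 144-month window of each model run, as the linked post does, shows how often internal variability alone produces a flat or negative trend.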


  3. NucEngineer  

    There has been atmospheric cooling for the last 8 years, and no new high global annual temperature in the last 11 years. None of the computer models replicate this fact. Anthropogenic (man-caused) global warming is not proved.

    The global warming adherents base their argument of proof on more than 20 different computer models called general circulation models (also known as global climate models or GCMs). Each computer model is composed of dozens of mathematical equations representing known scientific laws, theories, and hypotheses. Each equation has one or more constants. The constants associated with known laws are very well defined. The constants associated with known theories are generally accepted, but some of them may be off by a factor of 2 or more, maybe even an order of magnitude. The equations representing hypotheses, well, sometimes the hypotheses are just plain wrong. Then each of these equations has to be weighted against the others for use in the computer models, and that adds an additional variable (basically an educated guess) for each law, theory, and hypothesis. This is where the models are tweaked to mimic past climate measurements.

    The SCIENTIFIC METHOD is: (1) Following years of academic study of the known physical laws and accepted theories, and after reviewing some data, come up with a hypothesis to explain the data. (2) Develop a plan to obtain and analyze new data. (3) Collect and analyze the data, this may even require new technology not previously available. (4) Determine if the hypothesis is correct, needs refinement, or is wrong. Either way, new data is available for other researchers. (5) Submit results, including data, for peer review and publication.

    The output of the computer models, run out nearly 90 years forward, is considered to be data, but it is not a measurement of a physical phenomenon. Also, there is no way to analyze this so-called data to determine whether any of the hypotheses in the models are correct, need refinement, or are wrong. Nor can this method indicate whether other new hypotheses need to be generated and incorporated into the models. IT JUST IS NOT THE SCIENTIFIC METHOD.

    The worst flaw in the AGW argument is the treatment of GCM computer-generated outputs as data, which are then used in follow-on hypotheses. For example, if temperature rises by X degrees in 50 years, then Y will be affected in such-and-such a way, resulting in Z. Then the next person comes along and says, well, if Z happens, the effect on W will be a catastrophe. “I need (and deserve) more money to study the effects on W.” Hypotheses stacked on hypotheses, stacked on more hypotheses, all based on computer outputs that are not data, using a process that does not lend itself to proof by the SCIENTIFIC METHOD. Look at their results: IF, MIGHT, and COULD are used throughout their news-making results. And when one of the underlying hypotheses is proven incorrect, well, the public only remembers the doomsday results two or three iterations down the hypotheses train. The downstream hypotheses are not automatically thrown out and can even be used for more follow-on hypotheses.


  4. NucEngineer  

    You may find it interesting what the head of the IPCC said more than 1-1/2 years ago concerning the lack of new annual high global temperatures:
    http://www.reuters.com/article/idUSL1171501720080111
    Rajendra Pachauri, the head of the U.N. Panel that shared the 2007 Nobel Peace Prize with former U.S. Vice President Al Gore, said (more than 1-1/2 years ago) that he would look into the apparent temperature plateau so far this century.
    “One would really have to see on the basis of some analysis what this really represents,” he told Reuters more than 1-1/2 years ago, adding “are there natural factors compensating?” for increases in greenhouse gases from human activities.
    Also in this article from more than 1-1/2 years ago, Amir Delju, senior scientific coordinator of the World Meteorological Organization’s (WMO) climate program, said temperatures would have to be flat for several more years before a lack of new record years became significant.
    Well, we are over 3/4 of the way to being significant.


  5. actuator  

    It is incredible that anyone who understands that we don’t know all of the variables that impact the climate, nor the variability of the cycles of the variables that we do know, would believe any credible computer model could be devised.

    The climate has always changed. It has been both hotter and colder than it is now. But what is the “ideal” temperature? Well, we just don’t know, do we? But we do know from a historical context that agriculture and human endeavor seem to prefer warmth over cold.


  6. Why Natural Gas Should Not Play the Cap-and-Trade Game (the real enemy is mandated renewables/conservation, not coal) — MasterResource  

    […] and in all its permutations. CO2 should not be regulated for all sorts of reasons, beginning with new development in climate science against alarmism and continuing with the high economic costs of even a watered down starter […]


  7. T. Caine  

    “What is needed now more than ever is a “challenge culture”–even if it involves a bit of political incorrectness.”

    Absolutely. There is nothing to be lost by a challenging debate. On the contrary, chances are we will end up in a better place.

    I agree with those on both sides of the line that the climate is far more complex than our ability to measure or predict it, leaving us without indisputable proof that we are (or are not) a century away from cataclysmic results. But if the only result of preparing for the possibility of the worst is a society that is more efficient, pollutes less and lasts longer, why would we want to risk being wrong?


  8. Andrew  

    Because it’s not the “only result”?


  9. Of moles and whacking: “Climate models didn’t predict this lack of warming” « The Way Things Break  

    […] happen, rather than what ‘will’ happen”. Predictably, the post is being lauded in the […]


  10. Is Joe Romm a ‘Global Lukewarmer’? — MasterResource  

    […] what it is worth, I agree with Revkin on this (tune in next week to see why) as do an increasing number of science writers who are hedging their bets in line with recent data and what new peer reviewed articles are […]


  11. What Does the Last Decade Tell Us about Global Warming? (Hint: the ’skeptics’ have the momentum) — MasterResource  

    […] the Houston Chronicle’s Eric Berger made a pretty big deal about it in his recent SciGuy blog post, stating “But a funny thing happened on the way to the end […]

