What’s been happening recently in North Carolina (NC) is a microcosm of the anthropogenic global warming (AGW) story: politics versus science, ad hominem attacks versus journalism, evangelists versus pragmatists, and so on.
The contentiousness is over one of the main AGW battlefields: sea-level rise (SLR). North Carolina happens to have a large amount of coastline and has become the U.S. epicenter for this issue.
The brief version is that this began several years ago when a state agency, the Coastal Resources Commission (CRC), selected a roughly 20-member “science panel” to conduct a scientific assessment of the NC SLR situation through 2100. This could have been a very useful project if there had been balance in the personnel selections and if the panel’s assessment had adhered to scientific standards. Regrettably, neither happened, and the project soon jumped the rails, landing in the political-agenda ditch.
In their 2010 report, the panel concluded that NC should expect a 39-inch SLR by 2100. Their case was built around a 2007 paper by Stefan Rahmstorf, and was not encumbered by a single reference to a perspective different from Rahmstorf’s. Shortly after the report was released, state agencies started making the rounds of North Carolina coastal communities, putting them on notice that they would need to make BIG changes (elevating roads and bridges, re-zoning property, changing flood maps for insurance purposes, etc.).
As an independent scientist, I was solicited by my coastal county to provide a scientific perspective on this report. Even though I wasn’t a SLR expert, I could clearly see that this document was a classic case of confirmation bias, as it violated several scientific standards. But to get into the technical specifics, I solicited the input of about 40 international SLR experts (oceanographers, etc.).
June 12, 2012
The scientific findings of the human influence on the climate system have been, and perhaps will always be, a mixed bag. Assuming strong positive feedback effects, and thus a high climate sensitivity, it certainly can be argued that the bad outweighs the good. But if feedback effects are more neutral, the sign of the externality flips from negative to positive given that, on net, a moderately warmer, wetter, and CO2-fertilized world is quite arguably a better one.
Earth Day 2012 yesterday brought forth predictable cries of doom and gloom. But on closer inspection there are plenty of positives on the climate front, developments which have undoubtedly spilled over into making the earth a better place for humanity at large.
Here is my Top 10 list of positive climate developments based on the recent empirical data and the latest scientific literature:
April 23, 2012
On occasion, I have the opportunity to assist Dr. Patrick J. Michaels (Senior Fellow in Environmental Studies at the Cato Institute) in reviewing the latest scientific research on climate change. When we happen upon findings in the peer-reviewed scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press, Pat sometimes covers them over at the “Current Wisdom” section of the Cato@Liberty blog site.
His latest posting there highlights research findings that show that extreme weather events during last summer and the previous two winters can be fully explained by natural climate variability—and that “global warming” need not (and should not) be invoked.
This topic—whether or not weather extremes (or at least some portion of them) can be attributed to anthropogenic global warming (or, as Dr. Pielke Sr. prefers, anthropogenic climate change)—has been garnering a lot of attention of late. It was a major reason for holding the House Subcommittee hearing last week, it is a hot topic of discussion in the press, and it is the subject of an in-progress major report from the Intergovernmental Panel on Climate Change (IPCC).
As such, I wanted to highlight some of the findings that Pat reported on. I encourage a visit to the full article “Overplaying the Human Contribution to Recent Weather Extremes” over at Cato@Liberty.
The Great Russian Heat Wave of 2010
A new paper by Randall Dole and colleagues from the Physical Sciences Division (PSD) of the Earth System Research Laboratory (ESRL) of the National Oceanic and Atmospheric Administration (NOAA) examined the events leading up to and causing the big heat wave in Russia last summer (which was also part of an atmospheric pattern that was connected to the floods in Pakistan). Here is what they found:
“Our analysis points to a primarily natural cause for the Russian heat wave. This event appears to be mainly due to internal atmospheric dynamical processes that produced and maintained an intense and long-lived blocking event. Results from prior studies suggest that it is likely that the intensity of the heat wave was further increased by regional land surface feedbacks. The absence of long-term trends in regional mean temperatures and variability together with the [climate] model results indicate that it is very unlikely that warming attributable to increasing greenhouse gas concentrations contributed substantially to the magnitude of this heat wave.”
As Pat commented, “Can’t be much clearer than that.”
Recent Winter Severity
From Pat’s article:
March 21, 2011
Recently, Roger Pielke Sr., Senior Research Associate in the Department of Atmospheric and Oceanic Sciences (ATOC) at the University of Colorado Boulder, participated in the March 8, 2011 House of Representatives Energy and Commerce Committee hearing, Climate Science and EPA’s Greenhouse Gas Regulation.
His succinct testimony, reprinted below, provides a viewpoint on climate change science that is (refreshingly) different from the somewhat limited one espoused by the IPCC. While the IPCC sports blinders that prevent it from seeing much beyond human emissions as the primary culpable agent of climate change, Roger sees the much bigger, more complex picture. And Roger suggests that a response crafted for that big climate change picture would likely be much different from one designed for human emissions alone.
(Other testimony from scientists and House members, including Q&A, can be found here.)
Testimony to the Subcommittee on Energy and Power entitled “Climate Science and EPA’s Greenhouse Gas Regulation”
Roger A. Pielke Sr.
University of Colorado at Boulder and Colorado State University
8 March 2011
I have worked throughout my career to improve environmental conditions, including air quality, by conducting research, teaching and also by providing scientifically rigorous information to policy makers. At the state level, I served two terms on the Colorado Air Quality Control Commission where we developed the oxygenated fuels program to reduce atmospheric CO emissions from vehicles, promulgated regulations to mandate strict controls on wood and coal burning in residential fireplaces and stoves, and on asbestos concentrations in the air.
Four Main Points
In my testimony today (and in more detail in my written testimony) I have four main points:
March 16, 2011
In a MasterResource article a few months back, I walked everyone through a series of recent scientific findings and described how they cast new light on how the total amount of observed global warming to date could be divvied up among various causes. I ultimately concluded that the high confidence that the IPCC (and later the EPA) placed on the statement that “Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations” was misplaced.
This line of reasoning was recently incorporated into statements made by Dr. Patrick Michaels when testifying before the U.S. House of Representatives, Committee on Science and Technology, Subcommittee on Energy and Environment.
During the question-and-answer portion of the hearing, one of the other panelists, Dr. Benjamin Santer, quickly objected and claimed that Pat was “wrong” because he didn’t take into account the cooling influence of aerosols when determining how much of the observed warming should be assigned to greenhouse gases.
A day or so following the testimony, Judith Curry hosted a discussion on her blog site Climate Etc. to further examine Michaels’ logic. In her remarks introducing the thread, she too suggested that Pat was “obliged” to include sulfates in the calculation. When I stepped in to offer additional explanation, RealClimate’s Gavin Schmidt commented that he hoped I was “kidding,” and John Nielsen-Gammon of Texas A&M commented that my explanation was “nonsense.”
So with all these erudite folks claiming that Pat Michaels and I are wrong, I figured I ought to take another look into the logic behind our conclusions.
First let’s get a couple of things out of the way up front. The argument about whether or not the inclusion of sulfates is required to arrive at a logically correct conclusion has nothing whatsoever to do with the veracity and/or applicability of the scientific papers from which I’ve drawn some numbers (see my earlier post for details about these findings). I am not suggesting that there isn’t plenty of room to argue about those papers, just that doing so does not bear on the logic itself. So I’ll set those issues aside in order to focus on the topic at hand.
December 15, 2010
Back in January of 2009, I reviewed the implications of a then just-published article in Nature magazine that was billed as shedding important new light on some aspects of the long-term (since the late 1950s) temperature history across Antarctica.
The article, by Eric Steig and colleagues, described more warming taking place over a larger portion of Antarctica than had previously been recognized. The implication was that the temperature rise across Antarctica was not lagging behind the rest of the world and thus “not bucking the trend of global warming” as apparently some “contrarians” were claiming.
Now, that result must be tempered. A new paper is forthcoming that improves upon the analytical technique developed by the Steig team and finds significantly less warming across the continent as a whole (about 50% less), along with a different geographical pattern of temperature change; these results fit more closely with the existing (that is, pre-Steig et al.) perception of what has been going on down there. Basically, when a more correct analysis was performed, our understanding of what has been occurring in Antarctica was firmed up rather than badly shaken: Antarctica, on average, has warmed a little over the past 50 years, with the largest and most significant warming concentrated in the regions around the Antarctic Peninsula rather than spread somewhat evenly across the continent (as the Steig et al. result suggested).
But, perhaps the most interesting part of this story is that the new analysis grew from the blogosphere.
Soon after the Steig et al. article was published, it was being examined and critiqued on various blog sites. Among the criticisms was that the statistical technique pioneered by the Steig team was improperly implemented and that the published results were influenced by these inaccuracies.
An effort grew from these blog discussions to develop a better implementation of the methods and the results revealed a rather different picture of the patterns of temperature evolution across Antarctica than did the original Steig et al. paper (Figure 1).
Figure 1. Patterns of temperature changes over Antarctica (1957-2006) based on the new, updated analysis (left) and as reported by Steig et al. (right) (figure source: The Air Vent)
The Blogosphere at its Best
Now, this type of thing happens fairly frequently in blog space—a spirited critique of a scientific publication. But what doesn’t happen very frequently is that the blog discussions are formalized and submitted to a scientific journal. And in this case, not only were they submitted, but after a lengthy and extremely thorough review process, the new, improved findings have been accepted for publication in the Journal of Climate—a very well-respected scientific journal.
This shows the utility of blogs at their best: an initial informal critique hits upon a legitimate and important point of science, which is then formalized, submitted, and accepted into the peer-reviewed scientific literature, thereby making a much more permanent, citable, and more widely accessible contribution to the scientific knowledge base.
Kudos to Ryan O’Donnell, Nicholas Lewis, Steve McIntyre, and Jeff Condon.
To read more about how all this came to pass, please visit Jeff’s blog where some of the authors describe all that was involved from start to finish and include a preview of their results.
I wrote in my original MasterResource article “[The Steig et al. paper was] all in all a reasonable approach to the problem—but likely not the final word on the matter.” The new paper by O’Donnell et al. pretty much confirms this, and adds important new words to the story. Lead author Ryan O’Donnell describes it like this:
In my opinion, the Steig reconstruction was quite clever, and the general concept was sound. A few of the choices made during implementation were incorrect; a few were suboptimal. Importantly, if those are corrected, some of the results change. Also importantly, some do not. Hopefully some of the cautions outlined in our paper are incorporated into other, future work. Time will tell!
This is the way science is supposed to work. I am delighted to see the blogosphere opening the doors of scientific contribution to a wider audience. I hope this trend continues—science will be the better for it. But, importantly, making such a contribution requires a great deal of effort, persistence, and fortitude that extends far beyond a comment thread on a blog somewhere. I encourage more people who really are interested in making a lasting impact to grin and bear it and make the effort; it can be rather painful, but it provides great satisfaction in the end, and best of all, it keeps science moving forward.
So blog away, but when you hit upon something that you think is scientifically important, take the time to write it up and send it in to a journal—the end result could be rewarding for all of us.
December 10, 2010
The Holy Grail of climate change is a quantity known as the climate sensitivity—that is, how much the average global surface temperature will change from a doubling of the atmospheric carbon dioxide concentration. If we knew this number, we would have a much better idea of what, climatologically, was headed our way in the future and could make plans accordingly.
Thus far, however, this prize has been elusive. Back in 1990, in its very first Assessment Report, the Intergovernmental Panel on Climate Change (IPCC) suggested that the climate sensitivity was somewhere between 1.5°C and 4.5°C. In its latest Fourth Assessment Report, published in 2007, the IPCC said the climate sensitivity was likely to be between 2.0°C and 4.5°C, and unlikely to be less than 1.5°C. Not a whole heck of a lot more certain than where things stood 20 years ago, and this despite a veritable scientific crusade to determine a more precise value.
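To see what these sensitivity numbers imply in practice, here is a minimal sketch using the standard logarithmic dependence of warming on CO2 concentration (warming scales with the number of doublings). The function name and default values are illustrative assumptions of mine, not figures taken from the IPCC reports discussed above.

```python
import math

def warming_from_co2(c_ppm, c0_ppm=280.0, sensitivity=3.0):
    """Equilibrium warming (deg C) for a CO2 rise from c0_ppm to c_ppm,
    given a climate sensitivity expressed in deg C per CO2 doubling.
    Uses the standard logarithmic forcing approximation."""
    return sensitivity * math.log2(c_ppm / c0_ppm)

# By construction, a full doubling returns the sensitivity itself:
warming_from_co2(560.0, sensitivity=3.0)   # 3.0

# The wide IPCC range translates directly into a wide warming range
# for the same concentration pathway:
[round(warming_from_co2(560.0, sensitivity=s), 2) for s in (1.5, 4.5)]
```

The point of the sketch is simply that every projection of future warming inherits the full width of the sensitivity range, which is why pinning the number down matters so much.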
A prominent participant in this quest is Dr. Roy Spencer of the University of Alabama in Huntsville. Dr. Spencer has, for several years now, been trying to untangle climate feedbacks from climate forcings. If apparent feedbacks are really forcings, or vice versa, then the determination of climate sensitivity is confused and prone to being wrong (and likely erring on the high side).
Dr. Spencer has long held that what has generally been taken to be a positive feedback from cloud cover changes in response to climate warming (i.e. cloud changes act to further enhance a CO2-induced warming) is actually the other way around—random cloud cover changes force temperature changes. However, trying to demonstrate that this is the case has proven challenging, and trying to convince the general climate community has been virtually impossible.
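The logic of this cause-and-effect problem can be illustrated with a toy energy-balance simulation. To be clear, this is my own hedged sketch of the general issue, not Dr. Spencer's code or model: if random, persistent cloud variations force temperature (rather than merely responding to it), a simple regression of satellite-observed flux on temperature underestimates the true feedback.

```python
import random

def diagnosed_feedback(n_steps=20000, dt=0.1, heat_cap=10.0,
                       lam=3.0, phi=0.9, seed=42):
    """Toy mixed-layer energy balance: C * dT/dt = N(t) - lam * T(t),
    where N(t) is AR(1) 'cloud-like' radiative noise with persistence
    phi, and lam is the true feedback parameter (W/m^2 per K).

    Returns the OLS slope of observed net flux R = N - lam*T regressed
    on temperature T -- i.e., the feedback an analyst would diagnose
    from the data."""
    rng = random.Random(seed)
    innov_sd = (1.0 - phi ** 2) ** 0.5   # keeps var(N) near 1
    T, N = 0.0, 0.0
    Ts, Rs = [], []
    for _ in range(n_steps):
        N = phi * N + innov_sd * rng.gauss(0.0, 1.0)
        R = N - lam * T                  # flux anomaly at this instant
        Ts.append(T)
        Rs.append(R)
        T += R * dt / heat_cap           # integrate the energy balance
    mT = sum(Ts) / n_steps
    mR = sum(Rs) / n_steps
    cov = sum((t - mT) * (r - mR) for t, r in zip(Ts, Rs))
    var = sum((t - mT) ** 2 for t in Ts)
    return cov / var

# With white-noise forcing (phi=0) the regression recovers roughly -lam.
# With persistent cloud-like forcing (phi=0.9) the slope is biased
# toward zero, which would be misread as weak net feedback and hence
# an inflated apparent climate sensitivity.
```

The parameter values here are arbitrary; the qualitative result (the diagnosed slope shrinks toward zero when internal radiative forcing has persistence) is the point of contention Dr. Spencer raises.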
To help bring his ideas to a wider audience, Dr. Spencer has written a book about his hypothesis and his research in support of it, and has now, after years of tireless pursuit, published a paper in the peer-reviewed scientific literature.
Realizing that his findings run counter to the extant mainstream view of things, he has taken the step of asking “physical scientists everywhere” to try to debunk his ideas. The appeal for scrutiny is intended to serve both science and Dr. Spencer in helping to solidify and illuminate a potential new way forward to finding the elusive Grail.
Recently, Dr. Spencer wrote a nice summary of his on-going research and what, in his view, its implications are. Rather than have me rehash his synopsis, Dr. Spencer has graciously permitted us to reprint a piece that originally appeared on his excellent website (a site well worth checking from time to time).
Hopefully, readers of MasterResource will find this cutting-edge climate research interesting, and I am sure that if any of you have any pertinent suggestions for Dr. Spencer regarding his work, he would be happy to hear them.
Here is the excerpt:
September 21, 2010
Last week, an advance copy of a paper to appear in the Proceedings of the National Academy of Sciences (PNAS) was released, reporting that a collection of “experts” believes that climate tipping points (codename for something bad, though we don’t know exactly what) would be knocked over by 2200 if we stay on our current greenhouse gas emissions pathway for about the next 200 years. Underlying these views are the experts’ opinions as to what the earth’s equilibrium climate sensitivity—the rise in global temperatures resulting from a doubling of the earth’s carbon dioxide concentration—likely is.
But do the experts’ opinions actually reflect the scientific knowledge on these subjects?
The answer is no.
In fact, the experts’ opinions tended toward the extreme, despite recent science which should have reeled them in. That is a lesson in and of itself.
July 6, 2010
The wonderful “A billion here, a billion there, and pretty soon you’re talking real money” statement attributed to Senator Everett Dirksen may be apocryphal, but it remains a pertinent warning to our nation’s leaders. At a time when Congress is throwing billions of dollars around like pocket change based on the claims of scientists and engineers, an authentic Dirksen quote may be equally important (Congressional Record: June 16, 1965, p. 13884):
One time in the House of Representatives [a colleague] told me a story about a proposition that a teacher put to a boy. He said, ‘Johnny, a cat fell in a well 100 feet deep. Suppose that cat climbed up 1 foot and then fell back 2 feet. How long would it take the cat to get out of the well?’
Johnny worked assiduously with his slate and slate pencil for quite a while, and then when the teacher came down and said, ‘How are you getting along?’ Johnny said, ‘Teacher, if you give me another slate and a couple of slate pencils, I am pretty sure that in the next 30 minutes I can land that cat in hell.’
The nation needs Johnny. In fact, it may be time we hired a team of people like Johnny for every large science-based policy proposal Congress contemplates funding.
Carbon Capture and Storage: A Known Boondoggle
Consider, for example, the $4.4 billion Congress is putting into carbon capture and sequestration (CCS) research, nearly half of that to come from the Kerry-Lieberman climate bill. As Robert Bryce points out in the New York Times, “That’s a lot of money for a technology whose adoption faces three potentially insurmountable hurdles: it greatly reduces the output of power plants; pipeline capacity to move the newly captured carbon dioxide is woefully insufficient; and the volume of waste material is staggering.”
May 18, 2010
In a recent New York Times article, economist Robert H. Frank–“The Economic Naturalist”–argues that fighting global warming through government intervention entails a small cost and promises a large benefit. Yet to cast serious doubts on his claim, all we need do is quote from U.S. government and IPCC reports. We find that even in a textbook implementation, it’s not obvious that government mitigation efforts deliver net benefits.
Of course in the real world, if politicians and/or the EPA start intervening in the energy sector, their actions will be far from the economist’s theoretical ideal. Then the case for such policy activism falls apart.
Frank’s Pros/Cons of Intervention
Frank’s opening paragraphs nicely summarize his views on climate policy:
FORECASTS involving climate change are highly uncertain, denialists assert — a point that climate researchers themselves readily concede. The denialists view the uncertainty as strengthening their case for inaction, yet a careful weighing of the relevant costs and benefits supports taking exactly the opposite course.
Organizers of the recent climate conference in Copenhagen sought, unsuccessfully, to forge agreements to limit global warming to 3.6 degrees Fahrenheit by the end of the century. But even an increase that small would cause deadly harm. And far greater damage is likely if we do nothing.
Frank goes on to quote a new MIT study, which paints an alarming scenario of damages from warming if world governments sit on their hands. In contrast, Frank argues that the cost to the economy of limiting greenhouse gases is not in the same ballpark. He sums up with, “In short, the cost of preventing catastrophic climate change is astonishingly small, and it involves just a few simple changes in behavior.”
So if the risks of inaction are potentially catastrophic, while the costs of preventive government measures are relatively trivial, then who but a fool or a stooge for Big Oil would question the need for immediate intervention?
March 15, 2010