A free-market energy blog

Posts from — October 2009

Industrial Wind Technology: Interview of Jon Boone by Allegheny Treasures

Editor note: Jon Boone’s previous post on industrial wind parks led to this interview by Michael Morgan of Allegheny Treasures, an information resource dedicated to preserving the historic mountains of West Virginia and understanding the impact of industrial wind installations. 

Introduction: It’s been extremely difficult to bridge the gap that exists between those who know little about the issue and those who have a more comprehensive understanding of the workings of the electrical grid and the related technologies that supply it, like wind energy.  For many, their only information comes from the local press, “green” promotions by so-called environmental organizations, and occasional visits to web sites dedicated to one side or the other.  It’s often a mind-boggling quagmire!

The following conversation with Jon Boone, who now lives in Oakland, MD, after a 30-year career at the University of Maryland, College Park, is an attempt to bridge that gap, perhaps allowing us to better understand the limitations of and problems associated with industrial wind technology. He has no dog in the fight.

Allegheny Treasures – (Michael Morgan, interviewer):  Mr. Boone, wind developers and their supporters portray their technology as a viable source of renewable electricity, providing “nearly” free power by capturing the wind – a virtually inexhaustible source of energy. Their mantra is that wind energy is “free, clean, and green.” Can you explain your concern with this portrayal?

Mr. Boone:  Industrial wind technology is a meretricious commodity, attractive in a superficial way but without real value—seemingly plausible, even significant, but actually false and nugatory.  Those who would profit from it either economically or ideologically are engaged in wholesale deception. All adults should know that if something seems too good to be true, it almost always is. Although the wind itself may be “free,” the cost of converting it to electrical energy is extremely expensive. A 100MW wind project would cost, in today’s market, about $350 million, most of it paid for by taxpayers.

AT – Morgan: And—sorry to interrupt—what about its benefits, such as its alleged ability to shut down fossil fuel plants?

Boone:  In contrast to wind proponents’ alluring but empty promises of closed coal plants and reduced carbon emissions is this reality: wind energy is impotent while its environmental footprint is massive and malignant. It can’t dent a grape in the energy scheme of things; it’s a sideshow technology with great potential for mainline environmental harm. In some ways, it’s almost the perfect enterprise for our era, as it produces no meaningful product or service but is subsidized up to 80 percent by rate and taxpayers. Like many “celebrities,” it is famous for being famous, not for its actual performance.

AT – Morgan: Would you explain? [Read more →]

October 31, 2009   13 Comments

Political Capitalism: Understanding the Beast that Broke the Cage (Part I: what is political capitalism?)

Editor note: This piece is reproduced from the website www.politicalcaptitalism.org with the permission of the author. This post, the first in a series, is germane to the current debate over climate/energy legislation that is backed by a number of large U.S. corporations (Enron then; GE, Duke, DuPont, etc. now).

Political capitalism is a private-property, market-oriented system that is compromised by business-sponsored government intervention. It is a socioeconomic system in which many or most regulations, subsidies, and tax-code provisions result from the lobbying efforts of directly affected businesses and their allies.

Today in the United States, there is greater political transparency and competition between political elites than was evident in the business-dominated past (the 19th and most of the 20th centuries). Interventions routinely result from non-business special interests representing education, the environment, labor, minorities, religion, retirees, science, and taxpayers, among others. Still, business interests—unified or in opposition—are arguably the most important of the elites that compete for special government favor in American politics today.

There are two avenues to business success under a private-property, profit-and-loss system. When using the economic means, or free-market means, businessmen provide goods or services in an open market and rely on voluntary consumer patronage. When using the political means, businessmen obtain a governmental restriction or favor that provides the margin of success beyond what consumer preference alone would give. Market entrepreneurship is the way of capitalism; political entrepreneurship, or rent-seeking as it is known in the economics literature, is the way of political capitalism.

Business interests welcome competition for the things they buy (to minimize costs) far more than for things they sell. They may profess support for free enterprise in general but not in their particular area. There, competition is disparaged as “unbridled,” “cut-throat,” “excessive,” or “unfair,” and calls are made to constrain the free market.

Historian Gabriel Kolko has defined political capitalism as “the utilization of political outlets to attain conditions of stability, predictability, and security—to attain rationalization—in the economy.” [Read more →]

October 30, 2009   2 Comments

Dear Superfreakonomics Critics: Time Is Money in the Climate Debate Too

One of the ugliest battles in the blogosphere climate wars has involved the newly released Superfreakonomics, sequel to the best-selling Freakonomics. In the new book’s final chapter (available here in pdf), economist Steven Levitt and journalist Stephen Dubner set out to challenge the view that massively restricting carbon emissions is the only hope for averting planetwide catastrophe.

In this post I will link to some of the major commentary on the book so far, and then focus on U.C. Berkeley economist Brad DeLong’s specific claims that Levitt and Dubner’s arguments in support of geoengineering are somehow “bad economics.” As we’ll see, Levitt and Dubner might be wrong, but if so they are wrong because of the numbers. DeLong is painting their views as self-evidently absurd, but that’s only because he himself is overlooking a basic economic point.

The Background

Not surprisingly, the climate scientists and economists who are most vocal about the need for drastic emissions cutbacks were furious when the book’s contents began circulating. Joe Romm got the ball rolling with this fiery post; his ally in such matters, Paul Krugman, soon followed suit. Dubner defended himself and co-author Levitt against Romm’s accusations of intentional distortion in this post, and one of the primary sources for the chapter, physicist (and all-around guru) Nathan Myhrvold, defended himself from Romm’s accusations of ignorance here.

In the present post, [Read more →]

October 29, 2009   1 Comment

Is Texas Governor Perry Off Climate Base? (Groupthink vs. Science Revisited)

On October 16, 2009, the Houston Chronicle ran an Outlook piece by Dr. Ronald Sass, a fellow in global climate change at the Baker Institute and professor emeritus of natural sciences at Rice University, complaining that Texas Governor Rick Perry was getting his ideas about climate change from unreliable sources. Apparently, the fact that Governor Perry is not hopping on the climate alarmism/policy activism bandwagon has Dr. Sass a bit concerned. Make no mistake: the giver of this advice has a political agenda that goes well beyond natural-science issues.

Dr. Sass argues that the latest findings of the Intergovernmental Panel on Climate Change (IPCC) should be the be-all and end-all of the physical science debate. But he is behind the times. The IPCC report is several years old, and the latest theory and empirical data point in more benign directions than at the height of the climate alarm in the late 1990s. Perhaps the governor’s stance isn’t as off-base as Sass would like his Chronicle readers to believe.

Some Good Points

Dr. Sass does make some good points about the science, but he attaches too much significance to what setting the Governor straight on them would actually imply.

For instance, Dr. Sass points out, rightly, that human activities—primarily the burning of fossil fuels for energy—have contributed to the build-up of the atmospheric concentration of carbon dioxide and an enhancement of the earth’s natural greenhouse effect. If Governor Perry is being advised otherwise, he is getting bad advice.

And Dr. Sass is correct that an enhanced greenhouse effect will lead, in general, to a modified, and warmer, climate. Again, if the governor is being advised differently, he is being misinformed.

The Rest of the Story

However, from here Sass goes astray. 

It does not follow that, just because humans are modifying the climate, the outcome will be bad and we must therefore undertake immediate efforts to stop it (which is what Sass would like the Governor to do). [Read more →]

October 28, 2009   5 Comments

Kerry-Boxer: Its Bite is Worse than its Bark

Today, the Senate Environment and Public Works Committee will hold the first of three hearings on S. 1733, the “Clean Energy Jobs and American Power Act,” also known as Kerry-Boxer, after its co-sponsors Senators John Kerry (D-MA) and Barbara Boxer (D-CA). Kerry-Boxer is the Senate companion bill to H.R. 2454, the American Clean Energy and Security Act (ACESA), also known as Waxman-Markey, after its co-sponsors Reps. Henry Waxman (D-CA) and Ed Markey (D-MA).

For those worried about the economic impacts of these bills, I bring unwelcome news: their bite is worse than their bark. Escalator clauses common to both bills, ignored in most previous analyses, are the setup for dramatic increases in regulatory stringency well beyond the bills’ explicit emission reduction targets. Similarly, “findings” presenting the “scientific” rationale for cap-and-trade are not mere rhetorical fluff but precedents for litigation targeting emission sources considerably smaller than those explicitly identified as “covered entities.”

The Economic Debate So Far

Much of the economic debate on Waxman-Markey and Kerry-Boxer has been about the likely impacts of the specific emission-reduction targets proposed in these bills. The Heritage Foundation, National Black Chamber of Commerce, and American Council on Capital Formation/National Association of Manufacturers project substantial GDP and job losses. In contrast, the Environmental Protection Agency and Congressional Budget Office project much smaller costs.

EPA says Waxman-Markey would cost the average household only $140 annually. CBO puts the figure at about $175. Citing these estimates, proponents say cap-and-trade is cheap, costing only a postage stamp per person per day.

Heritage Foundation scholar Dr. David Kreutzer points out that EPA improperly used a technique called “discounting” to make the costs of cap-and-trade seem tiny: $140 is what a household would have to invest today, with compound interest, to pay the costs of Waxman-Markey in 2050. But according to EPA’s own numbers, in 2050 the cost to a family of four would be $2,700.
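For readers who want to check the arithmetic, here is a minimal Python sketch of the two framings above. The “postage stamp” figure simply divides EPA’s $140 annual estimate by 365; the discounting illustration uses a 5 percent rate and 40-year horizon that are assumptions of mine for illustration, not EPA’s actual methodology.

```python
# Two quick checks of the figures cited above. The discount rate and horizon
# below are illustrative assumptions, not taken from EPA's or CBO's analyses.

epa_annual_cost = 140.0
per_day = epa_annual_cost / 365
print(f"EPA's $140/year works out to about ${per_day:.2f} per day.")

future_cost_2050 = 2700.0   # EPA's own estimate of the 2050 cost to a family of four
rate = 0.05                 # assumed annual discount rate
years = 40                  # assumed horizon, roughly 2010 to 2050

present_value = future_cost_2050 / (1 + rate) ** years
print(f"Discounted at {rate:.0%} over {years} years, a ${future_cost_2050:,.0f} "
      f"cost in 2050 has a present value of only about ${present_value:,.0f}.")
```

Under these assumptions the $2,700 future cost discounts to roughly $380 today, which shows how a discounted figure can look far smaller than the cost households would actually face in 2050.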

Kreutzer and his colleagues also note that the CBO analysis does not estimate the GDP loss from higher energy prices, only the cost of the energy ration coupons, 85% of which are to be distributed free-of-charge in the initial years of the program.

A Breakthrough Institute analysis suggests that Waxman-Markey may cost pennies on the dollar because the bill’s “offset” provisions could allow “covered entities” to increase their emissions at business-as-usual rates through 2030. An offset is a credit earned for investments in projects — either in developing countries or in U.S. economic sectors outside the cap, such as agriculture and forestry — that reduce, avoid, or sequester emissions. The bill allows capped sources to utilize up to 2 billion tons’ worth of domestic and international offset credits each year in lieu of reducing their own emissions. If the offsets option is “fully utilized,” says Breakthrough, “Emissions in sectors of the economy supposedly ‘capped’ by ACESA could continue to grow at BAU rates until as late as 2037.”
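A rough sketch can illustrate the offsets arithmetic behind Breakthrough’s claim. Only the 2-billion-ton annual offset allowance comes from the bill as described above; the straight-line cap path (from a 5.4-billion-ton peak in 2016 to about 1 billion tons in 2050) and the flat business-as-usual level are simplifying assumptions of mine for illustration.

```python
# Illustrative sketch of the offsets arithmetic behind the Breakthrough claim.
# The cap trajectory and the flat BAU level are simplifying assumptions.

OFFSET_LIMIT = 2.0              # billion tons of offset credits usable per year
CAP_2016, CAP_2050 = 5.4, 1.0   # assumed peak cap and 2050 cap, billion tons
BAU = 5.5                       # assumed flat covered-sector BAU emissions, billion tons

def cap(year):
    """Assumed straight-line decline of the cap between 2016 and 2050."""
    return CAP_2016 + (CAP_2050 - CAP_2016) * (year - 2016) / (2050 - 2016)

# Covered sources need make no reductions of their own as long as BAU emissions
# exceed the cap by no more than the annual offset allowance.
last_bau_year = None
for year in range(2016, 2051):
    if BAU - cap(year) <= OFFSET_LIMIT:
        last_bau_year = year

print(f"Under these assumptions, covered emissions could stay at BAU through {last_bau_year}.")
```

With these stylized numbers the offset headroom covers the entire gap between business-as-usual emissions and the cap until about 2030, which is the mechanism Breakthrough describes.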

This assessment is not a realistic projection of what is likely to happen. [Read more →]

October 27, 2009   5 Comments

The Rest of Waxman–Markey: Caveat Emptor!

On June 26, 2009, the U.S. House of Representatives passed the Waxman-Markey climate bill, known also as the cap-and-trade bill. This is unfortunate, and not only because of cap-and-trade, which takes up no more than 30 percent of the bill’s pages. The rest of the telephone-book-sized HR 2454 details new regulations, wealth transfers, and taxes whose aggregate adverse impacts may well surpass those of cap-and-trade.

Here is a quick list of some important provisions of the American Clean Energy and Security Act of 2009, nicknamed the Enron Revitalization Act of 2009 here at MasterResource.

Still more encouragement for renewable resources that cannot pass market tests. A national “renewable portfolio standard” will require that 20 percent of the nation’s electricity in 2020 (relative to 2.8 percent today) come from sources the law defines as “renewable” or (to a limited degree) from improvements in efficiency. For the past ten years, wind power has been the only renewable whose output has grown substantially, and it is viable only because of a federal production tax credit. Other renewables are generally even less economic. The national RPS is special-interest legislation for wind.

Research on fuels and technologies that harms consumers and reduces employment opportunities. Section 114 institutes a new $1 billion annual tax on power from coal- and gas-fired [Read more →]

October 26, 2009   4 Comments

Industrial Wind Plants: Bad Economics, Bad Ecology

Editor Note: Jon Boone, a lifelong environmentalist, co-founded the North American Bluebird Society and has consulted for the Roger Tory Peterson Institute in New York. He has been a formal intervenor in two Maryland Public Service Commission hearings and produced and directed the documentary, Life Under a Windplant.

Industrial wind technology is a meretricious commodity, attractive in a superficial way but without real value—seemingly plausible, even significant but actually false and nugatory.

Those who would profit from it either economically or ideologically are engaged in wholesale deception. For in contrast to their alluring but empty promises of closed coal plants and reduced carbon emissions is this reality: Wind energy is impotent while its environmental footprint is massive and malignant.

A wind project with a rated capacity of 100 MW, for example, with 40 skyscraper-sized turbines, would likely produce an annual average of only 27 MW, an imperceptible fraction of energy for most grid systems. More than 60% of the time, it would produce less than 27 MW, and at times of peak demand it would often produce nothing. It would rarely achieve its rated capacity, producing the most at times of least demand. Whatever it generated would skitter continuously, intensifying and magnifying the destabilizing effects of demand fluctuations, for wind volatility is virtually indistinguishable from the phenomenon of people whimsically turning their appliances off and on.
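For readers who want to see the arithmetic, here is a minimal back-of-the-envelope sketch in Python of the figures in the paragraph above. The 27 percent capacity factor is simply what a 27 MW average from a 100 MW project implies; the 8,760-hour year is the only added assumption.

```python
# Back-of-the-envelope check of the figures above: a 100 MW project of 40
# turbines averaging 27 MW of output over the year.

rated_capacity_mw = 100.0
turbines = 40
average_output_mw = 27.0
hours_per_year = 8760

turbine_size_mw = rated_capacity_mw / turbines           # 2.5 MW per turbine
capacity_factor = average_output_mw / rated_capacity_mw  # 0.27, i.e. 27%
annual_energy_mwh = average_output_mw * hours_per_year   # ~236,500 MWh per year

print(f"Turbine size:    {turbine_size_mw:.1f} MW")
print(f"Capacity factor: {capacity_factor:.0%}")
print(f"Annual output:   {annual_energy_mwh:,.0f} MWh")
```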

Moreover, the project could never provide capacity value—specified amounts of energy on demand—a failing that should be anathema to regulatory agencies, whose task is to ensure reliable, secure, affordable electricity. The ability of machines to perform as expected on demand is the basis of modernity, underlying contemporary systems of economic growth, wealth creation, and well-being.

Machinery that doesn’t do this is quickly discarded, although this wasn’t the case for much of history (look at the early days of television or radio or even the automobile). Only in the last hundred years or so has the West come to rely on machines meeting this standard. Capacity value allows society to go from pillar to post in accordance with its own schedule. Wind provides no capacity value and can pass no test for reliability; one can never be sure how much energy it will produce at any future time. And generating units that don’t provide capacity value cannot be reasonably—and favorably—compared with those that do.

“Windball” Waste

Adding wind instability to a grid may be an engineer’s idea of job security, but it is criminal for ratepayers, taxpayers, and the environment. For the grid is then forced to extend itself. As the wind bounces randomly around the system, operators must continuously balance it to match supply precisely with demand, compensating for the ebb and flow much in the way flippers keep the steel ball in play during a game of pinball. [Read more →]

October 24, 2009   13 Comments

Climate Change: The Resilience Option (far better than climate stasis)

Climate Change: The Resilience Option
Kenneth P. Green
What Is Better, Climate Resilience or Climate Stasis?
In general, the mainstream response to the issue of climate change has been reactive, pessimistic, authoritarian, and resistant to change. Those alarmed about a changing climate would stand athwart the stream of climate history and cry “stop, enough!” Rather than working to cease human influence on climate, they want to find a way to make the climate stand still. This focus on creating climate stasis has led to policy proposals that would have been laughed at or dismissed as wacky conspiracy theories in the 1980s. But mainstream anti-climate-change activists are proposing nothing less than the establishment of global weather control through energy rationing, regulations, and taxes, all managed by a global bureaucracy with a goal of leading humanity into a future that will become smaller, more costly, and less dynamic over time. Environmental groups, along with organizations like the United Nations IPCC, are calling for nothing less than imposing climate stasis on a chaotic system.
Consider the climate bill now before Congress: the Waxman-Markey American Clean Energy and Security Act. Waxman-Markey sets the ambitious target of reducing total U.S. GHG emissions by 83 percent below 2005 levels by the year 2050 (with intermediate benchmarks at 2020 and 2030). Thus, the cap and the allowances sold pursuant to it will be lowered from a peak of 5.4 billion tons in 2016 to just a little over 1 billion tons in 2050. As my colleague Steven F. Hayward and I have pointed out elsewhere, these targets are absurd. From Department of Energy (DOE) historical statistics on energy consumption, it is possible to estimate that the United States last emitted 1 billion tons in the year 1910, when the nation’s population was only 92 million people, per-capita income (in 2008 dollars) was only $6,196, and total GDP (also in 2008 dollars) was about $572 billion—about one-twenty-fifth the size of the U.S. economy today. By the year 2050, however, the United States is expected to have a population of 420 million, according to Census Bureau projections—more than four times the population of 1910. In order to reach the 83 percent reduction target, per-capita carbon dioxide (CO2) emissions will have to be no more than 2.4 tons per person—only one-quarter the level of per-capita emissions in 1910.
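The per-capita figure follows directly from two numbers already cited above; here is a minimal worked check in Python using only the roughly 1-billion-ton 2050 cap and the Census Bureau’s 420 million population projection.

```python
# Worked check of the per-capita arithmetic above, using only figures cited in
# the text: a roughly 1-billion-ton cap in 2050 and 420 million people.

cap_2050_tons = 1.0e9      # "just a little over 1 billion tons" in 2050
population_2050 = 420e6    # Census Bureau projection cited above

per_capita_2050 = cap_2050_tons / population_2050
print(f"Implied 2050 per-capita CO2 emissions: {per_capita_2050:.1f} tons per person")
# Roughly 2.4 tons per person, about one-quarter of the 1910 per-capita level.
```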
When did the United States last experience per-capita CO2 emissions of only 2.4 tons? From the limited historical data available, it appears that this was about 1875. In 1875, the nation’s GDP (in 2008 dollars) was $147 billion, per-capita income (in 2008 dollars) was $3,300, and the population was only 45 million.
My colleague Kevin A. Hassett, Hayward and I have also written elsewhere about the problems with cap-and-trade and suggested that a revenue-neutral carbon tax would be preferable,  but that, too, represents an effort to impose stasis on a dynamic system simply using more efficient means. A carbon tax is, to be sure, vastly superior to a cap-and-trade system, but there are doubts that it is politically possible to enact one in a way that is actually revenue-neutral and is not abused by politicians who will look to tax those they dislike and rebate the taxes to groups they favor, namely, those which are most inclined to vote for their party.
A more forward-looking, optimistic, and free-market approach to the risks of climate variability accepts that the climate has been, is, and will be variable; focuses on the risks of variability; and looks for ways to build resilience in the face of that change, regardless of cause.
Aaron Wildavsky’s Resilience Paradigm
Aaron Wildavsky, one of the great policy analysts of the late twentieth century, wrote extensively about the benefits of resilient social institutions. Wildavsky observed that possible risk-reduction interventions lie along a spectrum from resilient to interceptive. Resilient approaches maximize our ability to cope with risk by maintaining a dynamic, market-based, knowledge-building strategy. Interceptive interventions emphasize targeted risk-reduction efforts that require certain actions and prohibit or restrict others. But how do we decide, for a given risk such as climate change, whether an interceptive approach is more likely to provide greater safety than a resilient approach?
Employing both theory and empirical observation, Wildavsky observed that uncertainties about the likelihood or extent of any given risk and about the effectiveness of any intervention constrain risk-reduction decisions.  He clearly demonstrated that a strategy of risk-interception is likely to be successful only in situations of truly excellent information.
So, for example, for a power plant owner who knows that a particular part is going to burn out every 150 days, an interception strategy of replacing the part every 149 days to prevent the risk is likely cost-effective. But where less information exists, more resilient strategies are likely to succeed, because interception will be either infeasible or expensive in such situations. If a power plant had 8,000 critical pieces of equipment that would create a fire upon failure, but the plant owner did not know the failure rates of each piece, trying to intercept the risk by replacing pieces before they failed would be enormously costly. Further, trying to have backup systems on all 8,000 pieces would be technologically difficult and probably not financially feasible. Instead, a strategy of resilience, such as implementing a sophisticated fire-response system, is more likely to be a feasible and efficient way of dealing with this risk.
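A small hypothetical cost comparison in Python may make the contrast concrete. Every number in it—part costs, fire costs, the guessed replacement schedule, the price of a fire-response system—is invented for illustration and does not come from Wildavsky or the text.

```python
# Hypothetical cost comparison illustrating the interception-vs-resilience logic.
# All dollar figures and schedules below are invented for illustration only.

PART_COST = 5_000
FIRE_COST = 2_000_000

# Case 1: failure time known exactly (the 150-day part).
# Replacing it on day 149 trades one part for an otherwise certain fire.
interception_known = PART_COST

# Case 2: 8,000 critical parts with unknown failure rates.
# Interception degenerates into blanket replacement on a guessed schedule.
PARTS = 8_000
GUESSED_REPLACEMENTS_PER_YEAR = 2
interception_unknown = PARTS * GUESSED_REPLACEMENTS_PER_YEAR * PART_COST

# Resilience: a one-time investment in a plant-wide fire-response system instead.
resilience_investment = 3_000_000

print(f"Known failure:    intercept for ${interception_known:,} vs. a ${FIRE_COST:,} fire")
print(f"Unknown failures: blanket interception ~${interception_unknown:,}/year "
      f"vs. ~${resilience_investment:,} one-time resilience investment")
```

With good information, interception is cheap and obviously worthwhile; without it, the interception bill balloons while a resilience investment stays fixed, which is Wildavsky’s point.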
In the case of climate change, our knowledge of the nature and scope of risks and future conditions is low, and our knowledge about how to intervene to head off specific risks is small. This suggests that current policy approaches that focus on mitigating GHG emissions largely to the exclusion of everything else are simply a waste of attention and resources, and resilience should be considered the default climate strategy. And to a large extent, the resilience option is the complete opposite of the climate-stasis approach; it focuses on decentralization, deregulation, and freeing markets to maximize resilience.
Managing Risks with Resilience-building Policies
A vast range of risks has been discussed in the context of climate change, from flood and drought to threatened food supplies, more deadly insect-borne diseases, higher heat-related deaths, rising sea levels, and so forth. Several approaches that economists and policy analysts have identified could help increase social resilience to such risks.
Eliminate risk subsidies. Predicted damages associated with rising sea levels and storms are high because of the popularity of coastal locales for high-density business and upscale residential development. As a result, damages from extreme coastal weather events have been hugely expensive. The damages from Hurricane Katrina, for example, reached over $150 billion. The question, however, is why so much value was so badly protected against completely predictable events. Levees and sea-walls were under-designed. Many houses and businesses were not insured against flood damage. As Charles Perrow observes in Our Next Catastrophe, “Even in areas known to be hazardous, only about 20 percent of homeowners purchase flood insurance, and less than 50 percent of businesses purchase flood and earthquake insurance in risky areas.”
The reason for much of that risk-taking is the role of state and federal governments as insurers of last resort. People know that in the event of a disaster, even if uninsured, the Federal Emergency Management Agency will give grants to help people recover from natural disasters such as hurricanes, floods, and storm surges. Without such assurances, we can assume that many people would be unwilling to face the risk of living in coastal areas that could be flooded by rising sea levels, and would relocate to higher ground. Capital needed for businesses would also avoid areas at high risk from sea-level rise, preventing further siting of high-value structures in vulnerable areas. If risk subsidies cannot be abolished entirely, then at the very least government insurance programs should charge risk-based premiums.
Privatize Infrastructure. Climate change could also pose a challenge for coastal or low-lying roadways, water-treatment facilities facing increased rainfall intensity, energy utilities facing increased summertime electricity demand, and so on. Governments are quite good at building infrastructure. After all, what politician does not enjoy a ribbon-cutting ceremony for some new element of name-bearing infrastructure? But governments are dismal at maintaining infrastructure, as they generally fail to establish a revenue stream that both maintains the system and provides feedback about whether a particular road should be raised, a water-treatment facility expanded, or power capacity increased. A solution to these problems, as well as a potential source of revenue for cash-strapped state and municipal governments, is the privatization of infrastructure. While a few poorly executed privatization efforts have tarnished the name, the baby should not be thrown out with the bathwater; privatization offers a host of benefits. A great deal of research on privatization in developing and developed countries demonstrates that, on the whole, privatization shows considerably more benefit than risk. One reason is that private owners of infrastructure have a great deal of investment tied up in getting a long-run stream of revenue from the infrastructure. Ensuring that future changes in climate do not disrupt that long-run cash flow is critical to their current financial performance.
Roadways. If roads are privately owned and tolled, road operators have a revenue stream to tap in order to raise, resurface, or re-contour roadways to adapt to climate changes. If costs of such adaptation are high, tolls will rise, and at some point, an economic decision will occur about whether a road should be maintained, or whether some alternate route should be developed. In some cases people may indeed find their transportation options so limited that they must move away to a place with a less fragile climate. One can imagine something like this for some coastal roadways where there are no easy alternate routes, but it would probably be a fairly rare outcome. Still, if such situations did develop, this is a desirable outcome, as it is both economically efficient and reduces the likely cost of climate-related damages to structures.
Electricity Supply. As long as governments distort the prices consumers pay for energy with subsidies, fuel mandates, renewable power mandates and the like, electricity markets cannot effectively adapt to changing climatic conditions. If electricity markets were fully deregulated, and if full costs were passed onto consumers, price signals would be created for the electricity provider in terms of expanding or decreasing capacity and for the consumer in terms of the real cost of living in an environment subject to energy-consuming heat waves (or cold snaps). Privatization would create incentives for electricity conservation and for the acquisition of energy-efficient appliances and devices without any need for specific governmental efficiency standards. Further, electric companies would be driven to connect with one another to ensure reliability to their customers rather than doing the minimum possible to satisfy regulators.
Water Supply. Full pricing of water and full privatization of the water supply, drinking water plants, and wastewater treatment plants would ameliorate many climatic risks incrementally over time, including flooding, seawater intrusion, and coastal and river pollution from storm runoff. Charging the full price for water, from supply to disposal, would create a price signal for consumers regarding the real risks they face living in hydrologically sensitive areas and create incentives for conservation while producing a revenue stream to allow for expanded capability or the securing of alternative supplies. At some point, again, high prices could simply lead people to move away from areas that are hydrologically costly, such as cities dependent on a single winter snow pack that shrinks or a single major river that suffers reduced flow.
Flooding. What is not achieved by removing insurance subsidies in flood-prone areas can be managed through the creation of privately administered hydrologic utilities, which would be financed by flood-protection fees charged to residents of flood prone areas. Again, such a system creates a price signal that can show when it is and when it is not efficient to raise the height of a levee, for example, or to expand permeable surfacing requirements in development. The cost of paying for such activities would send the consumer a signal about the true cost of living in flood-prone areas, and would ultimately lead those who could not afford to fully finance their level of risk to relocate to safer areas.
Trust in Resilience, but Tie up Your Camel
In the event that climate change does tend toward higher estimates put forward by the United Nations and other groups, it is reasonable to consider insurance options that might help deal with such climate changes. Such options might include government investment in geoengineering research, investment in research and development to advance technologies allowing the removal of GHGs from the atmosphere, and possibly the creation of a climate adaptation fund to be used where state and local governments find themselves unable to cope with a given climate change, or even to compensate others should it ultimately be shown that U.S. emissions of GHGs have caused harm to other countries or the property of other individuals.
It has long been known that certain types of risk are not suited to attempted prevention, but instead must be met with the resilience needed to live with the risk. Climate change is one such risk that is, as the world is increasingly observing, virtually impossible to prevent, whether it is manmade or natural.
As efforts to mitigate GHGs fail around the world, it is long past time to broaden the tools available to us in order to make our society resilient to climate risk. Rather than remain largely focused on the quixotic effort to reduce GHG emissions or to stand athwart the stream of climate and shout “stop, enough!” we should shift the majority of our policymaking attention to an agenda of resilience building and adaptation, two areas  with which governments particularly struggle. Plan B for climate resilience should consist of an aggressive program of resilience building through the elimination of risk subsidies, and the privatization of infrastructure. Other subsidies and regulations that make the overall economy more brittle in the face of climate change would also be ripe targets for removal, such as those which permeate energy and water markets.
Climate Change: The Resilience Option
Kenneth P. Green
What Is Better, Climate Resilience or Climate Stasis?
In general, the mainstream response to the issue of climate change has been reactive, pessimistic, authoritarian, and resistant to change. Those alarmed about a changing climate would stand athwart the stream of climate history and cry “stop, enough!” Rather than working to cease human influence on climate, they want to find a way to make the climate stand still. This focus on creating climate stasis has led to policy proposals that would have been laughed at or dismissed as wacky conspiracy theories in the 1980s. But mainstream anti-climate-change activists are proposing nothing less than the establishment of global weather control through energy rationing, regulations, and taxes, all managed by a global bureaucracy with a goal of leading humanity into a future that will become smaller, more costly, and less dynamic over time. Environmental groups, along with organizations like the United Nations IPCC, are calling for nothing less than imposing climate stasis on a chaotic system.
Consider the climate bill now before Congress: the Waxman-Markey American Climate and Energy Security Act. Waxman-Markey sets the ambitious target of reducing total U.S. GHG emissions by 83 percent below 2005 levels by the year 2050 (with intermediate benchmarks at 2020 and 2030). Thus, the cap and the allowances sold pursuant to it will be lowered from a peak of 5.4 billion tons in 2016 to just a little over 1 billion tons in 2050. As my colleague Steven F. Hayward and I have pointed out elsewhere, these targets are absurd.   From Department of Energy (DOE) historical statistics on energy consumption, it is possible to estimate that the United States last emitted 1 billion tons in the year 1910, when the nation’s population was only 92 million people, per-capita income (in 2008 dollars) was only $6,196, and total GDP (also in 2008 dollars) was about $572 billion—about one-twenty-fifth the size of the U.S. economy today. By the year 2050, however, the United States is expected to have a population of 420 million, according to Census Bureau projections—more than four times the population of 1910. In order to reach the 83 percent reduction target, per-capita carbon dioxide (CO2) emissions will have to be no more than 2.4 tons per person—only one-quarter the level of per-capita emissions in 1910.
When did the United States last experience per-capita CO2 emissions of only 2.4 tons? From the limited historical data available, it appears that this was about 1875. In 1875, the nation’s GDP (in 2008 dollars) was $147 billion, per-capita income (in 2008 dollars) was $3,300, and the population was only 45 million.
My colleague Kevin A. Hassett, Hayward and I have also written elsewhere about the problems with cap-and-trade and suggested that a revenue-neutral carbon tax would be preferable,  but that, too, represents an effort to impose stasis on a dynamic system simply using more efficient means. A carbon tax is, to be sure, vastly superior to a cap-and-trade system, but there are doubts that it is politically possible to enact one in a way that is actually revenue-neutral and is not abused by politicians who will look to tax those they dislike and rebate the taxes to groups they favor, namely, those which are most inclined to vote for their party.
A more forward-looking, optimistic, and free-market approach to the risks of climate variability accepts that the climate has been, is, and will be variable; focuses on the risks of variability; and looks for ways to build resilience in the face of that change, regardless of cause.
Aaron Wildavsky’s Resilience Paradigm
Aaron Wildavsky, one of the great policy analysts of the late twentieth century, wrote extensively about the benefits of resilient social institutions. Wildavsky observed that possible risk-reduction interventions lie along a spectrum from resilient to interceptive. Resilient approaches maximize our ability to cope with risk by maintaining a dynamic, market-based, knowledge-building strategy. Interceptive interventions emphasize specific risk-reduction efforts that require certain specific actions and prohibit or restrict others.  But how do we decide, for a given risk such as climate change, whether an interceptive approach is more likely to provide greater safety than a resilient approach?
Employing both theory and empirical observation, Wildavsky observed that uncertainties about the likelihood or extent of any given risk and about the effectiveness of any intervention constrain risk-reduction decisions.  He clearly demonstrated that a strategy of risk-interception is likely to be successful only in situations of truly excellent information.
So, for example, for a power plant owner who knows that a particular part is going to burn out every 150 days an interception strategy of replacing the part every 149 days to prevent the risk is likely cost-effective. But where less information exists, more resilient strategies are likely to succeed, because interception will be either infeasible or expensive in such situations. If a power plant had 8,000 critical pieces of equipment that would create a fire upon failure but the plant owner did not know the failure rates of each piece, trying to intercept the risk by replacing pieces before they failed would be enormously costly. Further, trying to have backup systems on all 8,000 pieces would be technologically difficult and probably not financially feasible. Instead, a strategy of resilience, such as implementing a sophisticated fire-response system, is more likely to be a feasible and efficient way of dealing with this risk.
In the case of climate change, our knowledge of the nature and scope of risks and future conditions is low, and our knowledge about how to intervene to head off specific risks is small. This suggests that current policy approaches that focus on mitigating GHG emissions largely to the exclusion of everything else are simply a waste of attention and resources, and resilience should be considered the default climate strategy. And to a large extent, the resilience option is the complete opposite of the climate-stasis approach; it focuses on decentralization, deregulation, and freeing markets to maximize resilience.
Managing Risks with Resilience-building Policies
A vast range of risks has been discussed in the context of climate change, from flood to drought, threatened food supplies, more deadly insect-borne diseases; higher heat-related deaths; rising sea levels, and so forth. Several approaches economists and policy analysts have identified could help increase social resilience to such risks.
Eliminate risk subsidies. Predicted damages associated with sea levels and storms are high because of the popularity of such locales for high-density business and upscale residential development. As a result, damages from extreme coastal weather events have been hugely expensive. The damages from Hurricane Katrina, for example, reached over $150 billion.  The question, however, is why there was so much value that was so badly protected against completely predictable events? Levees and sea-walls were under-designed. Many houses and businesses were not insured against flood damage. As Charles Perrow observes in Our Next Catastrophe, “Even in areas known to be hazardous, only about 20 percent of homeowners purchase flood insurance, and less than 50 percent of businesses purchase flood and earthquake insurance in risky areas.”
The reason for much of that risk-taking is the role of state and federal governments as the insurer of last resort. People know that in the event of a disaster, even if uninsured, the Federal Emergency Management Agency will give grants to let people recover from natural disasters such as hurricanes, floods, and storm surges. Without such assurances, we can assume that many people would be unwilling to face the risk of living in coastal areas that could be flooded by rising sea levels, and would relocate to higher ground. Capital needed for businesses would also avoid areas of high-risk due to sea-level rise, preventing further siting of high-value structures in vulnerable areas. If risk subsidies cannot be abolished entirely, at the very least, they should charge risk-based premiums.
Privatize Infrastructure. Climate change could also pose a challenge for coastal or low-lying roadways, water-treatment facilities facing increased rainfall intensity, energy utilities facing increased summertime electricity demand, and so on. Governments are quite good at building infrastructure. After all, what politician does not enjoy a ribbon cutting ceremony for some new element of name-bearing infrastructure? But governments are dismal at maintaining infrastructure, as they generally fail to establish a revenue stream to maintain a system that provides feedback about whether a particular road should be raised or a water-treatment facility expanded or a power-capability increased. A solution to these problems, as well as a potential source of revenue for cash-strapped state and municipal governments is the privatization of infrastructure. While a few poorly executed privatization efforts have tarnished the name, the baby should not be thrown out with the bathwater; privatization offers a host of benefits. A great deal of research on privatization in developing and developed countries demonstrates that, on the whole, privatization shows considerably more benefit than risk. One reason is that private owners of infrastructure have a lot of investment tied up in getting a long-run stream of revenue from the infrastructure. Ensuring that future changes in climate do not disrupt that long- run cash flow is critical to their current financial performance.
Roadways. If roads are privately owned and tolled, road operators have a revenue stream to tap in order to raise, resurface, or re-contour roadways to adapt to climate changes. If costs of such adaptation are high, tolls will rise, and at some point, an economic decision will occur about whether a road should be maintained, or whether some alternate route should be developed. In some cases people may indeed find their transportation options so limited that they must move away to a place with a less fragile climate. One can imagine something like this for some coastal roadways where there are no easy alternate routes, but it would probably be a fairly rare outcome. Still, if such situations did develop, this is a desirable outcome, as it is both economically efficient and reduces the likely cost of climate-related damages to structures.
Electricity Supply. As long as governments distort the prices consumers pay for energy with subsidies, fuel mandates, renewable power mandates and the like, electricity markets cannot effectively adapt to changing climatic conditions. If electricity markets were fully deregulated, and if full costs were passed onto consumers, price signals would be created for the electricity provider in terms of expanding or decreasing capacity and for the consumer in terms of the real cost of living in an environment subject to energy-consuming heat waves (or cold snaps). Privatization would create incentives for electricity conservation and for the acquisition of energy-efficient appliances and devices without any need for specific governmental efficiency standards. Further, electric companies would be driven to connect with one another to ensure reliability to their customers rather than doing the minimum possible to satisfy regulators.
Water Supply. Full pricing of water and full privatization of the water supply, drinking water plants, and wastewater treatment plants would ameliorate many climatic risks incrementally over time, including flooding, seawater intrusion, and coastal and river pollution from storm runoff. Charging the full price for water, from supply to disposal, would create a price signal for consumers regarding the real risks they face living in hydrologically sensitive areas and create incentives for conservation while producing a revenue stream to allow for expanded capability or the securing of alternative supplies. At some point, again, high prices could simply lead people to move away from areas that are hydrologically costly, such as cities dependent on a single winter snow pack that shrinks or a single major river that suffers reduced flow.
Flooding. What is not achieved by removing insurance subsidies in flood-prone areas can be managed through the creation of privately administered hydrologic utilities, which would be financed by flood-protection fees charged to residents of flood prone areas. Again, such a system creates a price signal that can show when it is and when it is not efficient to raise the height of a levee, for example, or to expand permeable surfacing requirements in development. The cost of paying for such activities would send the consumer a signal about the true cost of living in flood-prone areas, and would ultimately lead those who could not afford to fully finance their level of risk to relocate to safer areas.
Trust in Resilience, but Tie up Your Camel
In the event that climate change does tend toward higher estimates put forward by the United Nations and other groups, it is reasonable to consider insurance options that might help deal with such climate changes. Such options might include government investment in geoengineering research, investment in research and development to advance technologies allowing the removal of GHGs from the atmosphere, and possibly the creation of a climate adaptation fund to be used where state and local governments find themselves unable to cope with a given climate change, or even to compensate others should it ultimately be shown that U.S. emissions of GHGs have caused harm to other countries or the property of other individuals.
It has long been known that certain types of risk are not suited to attempted prevention, but instead must be met with the resilience needed to live with the risk. Climate change is one such risk that is, as the world is increasingly observing, virtually impossible to prevent, whether it is manmade or natural.
As efforts to mitigate GHGs fail around the world, it is long past time to broaden the tools available to us in order to make our society resilient to climate risk. Rather than remain largely focused on the quixotic effort to reduce GHG emissions or to stand athwart the stream of climate and shout “stop, enough!” we should shift the majority of our policymaking attention to an agenda of resilience building and adaptation, two areas  with which governments particularly struggle. Plan B for climate resilience should consist of an aggressive program of resilience building through the elimination of risk subsidies, and the privatization of infrastructure. Other subsidies and regulations that make the overall economy more brittle in the face of climate change would also be ripe targets for removal, such as those which permeate energy and water markets.
Climate Change: The Resilience Option
Kenneth P. Green
What Is Better, Climate Resilience or Climate Stasis?
In general, the mainstream response to the issue of climate change has been reactive, pessimistic, authoritarian, and resistant to change. Those alarmed about a changing climate would stand athwart the stream of climate history and cry “stop, enough!” Rather than working to cease human influence on climate, they want to find a way to make the climate stand still. This focus on creating climate stasis has led to policy proposals that would have been laughed at or dismissed as wacky conspiracy theories in the 1980s. But mainstream anti-climate-change activists are proposing nothing less than the establishment of global weather control through energy rationing, regulations, and taxes, all managed by a global bureaucracy with a goal of leading humanity into a future that will become smaller, more costly, and less dynamic over time. Environmental groups, along with organizations like the United Nations IPCC, are calling for nothing less than imposing climate stasis on a chaotic system.
Consider the climate bill now before Congress: the Waxman-Markey American Climate and Energy Security Act. Waxman-Markey sets the ambitious target of reducing total U.S. GHG emissions by 83 percent below 2005 levels by the year 2050 (with intermediate benchmarks at 2020 and 2030). Thus, the cap and the allowances sold pursuant to it will be lowered from a peak of 5.4 billion tons in 2016 to just a little over 1 billion tons in 2050. As my colleague Steven F. Hayward and I have pointed out elsewhere, these targets are absurd.   From Department of Energy (DOE) historical statistics on energy consumption, it is possible to estimate that the United States last emitted 1 billion tons in the year 1910, when the nation’s population was only 92 million people, per-capita income (in 2008 dollars) was only $6,196, and total GDP (also in 2008 dollars) was about $572 billion—about one-twenty-fifth the size of the U.S. economy today. By the year 2050, however, the United States is expected to have a population of 420 million, according to Census Bureau projections—more than four times the population of 1910. In order to reach the 83 percent reduction target, per-capita carbon dioxide (CO2) emissions will have to be no more than 2.4 tons per person—only one-quarter the level of per-capita emissions in 1910.
When did the United States last experience per-capita CO2 emissions of only 2.4 tons? From the limited historical data available, it appears that this was about 1875. In 1875, the nation’s GDP (in 2008 dollars) was $147 billion, per-capita income (in 2008 dollars) was $3,300, and the population was only 45 million.
My colleague Kevin A. Hassett, Hayward and I have also written elsewhere about the problems with cap-and-trade and suggested that a revenue-neutral carbon tax would be preferable,  but that, too, represents an effort to impose stasis on a dynamic system simply using more efficient means. A carbon tax is, to be sure, vastly superior to a cap-and-trade system, but there are doubts that it is politically possible to enact one in a way that is actually revenue-neutral and is not abused by politicians who will look to tax those they dislike and rebate the taxes to groups they favor, namely, those which are most inclined to vote for their party.
A more forward-looking, optimistic, and free-market approach to the risks of climate variability accepts that the climate has been, is, and will be variable; focuses on the risks of variability; and looks for ways to build resilience in the face of that change, regardless of cause.
Aaron Wildavsky’s Resilience Paradigm
Aaron Wildavsky, one of the great policy analysts of the late twentieth century, wrote extensively about the benefits of resilient social institutions. Wildavsky observed that possible risk-reduction interventions lie along a spectrum from resilient to interceptive. Resilient approaches maximize our ability to cope with risk by maintaining a dynamic, market-based, knowledge-building strategy. Interceptive interventions emphasize specific risk-reduction efforts that require certain specific actions and prohibit or restrict others.  But how do we decide, for a given risk such as climate change, whether an interceptive approach is more likely to provide greater safety than a resilient approach?
Employing both theory and empirical observation, Wildavsky observed that uncertainties about the likelihood or extent of any given risk and about the effectiveness of any intervention constrain risk-reduction decisions.  He clearly demonstrated that a strategy of risk-interception is likely to be successful only in situations of truly excellent information.
So, for example, for a power plant owner who knows that a particular part is going to burn out every 150 days an interception strategy of replacing the part every 149 days to prevent the risk is likely cost-effective. But where less information exists, more resilient strategies are likely to succeed, because interception will be either infeasible or expensive in such situations. If a power plant had 8,000 critical pieces of equipment that would create a fire upon failure but the plant owner did not know the failure rates of each piece, trying to intercept the risk by replacing pieces before they failed would be enormously costly. Further, trying to have backup systems on all 8,000 pieces would be technologically difficult and probably not financially feasible. Instead, a strategy of resilience, such as implementing a sophisticated fire-response system, is more likely to be a feasible and efficient way of dealing with this risk.
In the case of climate change, our knowledge of the nature and scope of risks and future conditions is low, and our knowledge about how to intervene to head off specific risks is small. This suggests that current policy approaches that focus on mitigating GHG emissions largely to the exclusion of everything else are simply a waste of attention and resources, and resilience should be considered the default climate strategy. And to a large extent, the resilience option is the complete opposite of the climate-stasis approach; it focuses on decentralization, deregulation, and freeing markets to maximize resilience.
Managing Risks with Resilience-building Policies
A vast range of risks has been discussed in the context of climate change, from flood to drought, threatened food supplies, more deadly insect-borne diseases; higher heat-related deaths; rising sea levels, and so forth. Several approaches economists and policy analysts have identified could help increase social resilience to such risks.
Eliminate risk subsidies. Predicted damages associated with sea levels and storms are high because of the popularity of such locales for high-density business and upscale residential development. As a result, damages from extreme coastal weather events have been hugely expensive. The damages from Hurricane Katrina, for example, reached over $150 billion.  The question, however, is why there was so much value that was so badly protected against completely predictable events? Levees and sea-walls were under-designed. Many houses and businesses were not insured against flood damage. As Charles Perrow observes in Our Next Catastrophe, “Even in areas known to be hazardous, only about 20 percent of homeowners purchase flood insurance, and less than 50 percent of businesses purchase flood and earthquake insurance in risky areas.”
The reason for much of that risk-taking is the role of state and federal governments as the insurer of last resort. People know that in the event of a disaster, even if uninsured, the Federal Emergency Management Agency will give grants to let people recover from natural disasters such as hurricanes, floods, and storm surges. Without such assurances, we can assume that many people would be unwilling to face the risk of living in coastal areas that could be flooded by rising sea levels, and would relocate to higher ground. Capital needed for businesses would also avoid areas of high-risk due to sea-level rise, preventing further siting of high-value structures in vulnerable areas. If risk subsidies cannot be abolished entirely, at the very least, they should charge risk-based premiums.
Privatize Infrastructure. Climate change could also pose a challenge for coastal or low-lying roadways, water-treatment facilities facing increased rainfall intensity, energy utilities facing increased summertime electricity demand, and so on. Governments are quite good at building infrastructure. After all, what politician does not enjoy a ribbon cutting ceremony for some new element of name-bearing infrastructure? But governments are dismal at maintaining infrastructure, as they generally fail to establish a revenue stream to maintain a system that provides feedback about whether a particular road should be raised or a water-treatment facility expanded or a power-capability increased. A solution to these problems, as well as a potential source of revenue for cash-strapped state and municipal governments is the privatization of infrastructure. While a few poorly executed privatization efforts have tarnished the name, the baby should not be thrown out with the bathwater; privatization offers a host of benefits. A great deal of research on privatization in developing and developed countries demonstrates that, on the whole, privatization shows considerably more benefit than risk. One reason is that private owners of infrastructure have a lot of investment tied up in getting a long-run stream of revenue from the infrastructure. Ensuring that future changes in climate do not disrupt that long- run cash flow is critical to their current financial performance.
Roadways. If roads are privately owned and tolled, road operators have a revenue stream to tap in order to raise, resurface, or re-contour roadways to adapt to climate changes. If the costs of such adaptation are high, tolls will rise, and at some point an economic decision will be made about whether a road should be maintained or whether some alternate route should be developed. In some cases, people may indeed find their transportation options so limited that they must move away to a place with a less fragile climate. One can imagine something like this for some coastal roadways where there are no easy alternate routes, but it would probably be a fairly rare outcome. Still, if such situations did develop, this would be a desirable outcome, as it is both economically efficient and reduces the likely cost of climate-related damages to structures.
Electricity Supply. As long as governments distort the prices consumers pay for energy with subsidies, fuel mandates, renewable power mandates, and the like, electricity markets cannot effectively adapt to changing climatic conditions. If electricity markets were fully deregulated, and if full costs were passed on to consumers, price signals would be created for the electricity provider in terms of expanding or decreasing capacity and for the consumer in terms of the real cost of living in an environment subject to energy-consuming heat waves (or cold snaps). Privatization would create incentives for electricity conservation and for the acquisition of energy-efficient appliances and devices without any need for specific governmental efficiency standards. Further, electric companies would be driven to connect with one another to ensure reliability to their customers rather than doing the minimum possible to satisfy regulators.
Water Supply. Full pricing of water and full privatization of the water supply, drinking-water plants, and wastewater-treatment plants would ameliorate many climatic risks incrementally over time, including flooding, seawater intrusion, and coastal and river pollution from storm runoff. Charging the full price for water, from supply to disposal, would create a price signal for consumers regarding the real risks they face living in hydrologically sensitive areas and create incentives for conservation, while producing a revenue stream to allow for expanded capacity or the securing of alternative supplies. At some point, again, high prices could simply lead people to move away from areas that are hydrologically costly, such as cities dependent on a single winter snowpack that shrinks or a single major river that suffers reduced flow.
Flooding. What is not achieved by removing insurance subsidies in flood-prone areas can be managed through the creation of privately administered hydrologic utilities, financed by flood-protection fees charged to residents of flood-prone areas. Again, such a system creates a price signal that can show when it is and when it is not efficient to raise the height of a levee, for example, or to expand permeable-surfacing requirements in development. The cost of paying for such activities would send the consumer a signal about the true cost of living in flood-prone areas and would ultimately lead those who could not afford to fully finance their level of risk to relocate to safer areas.
Trust in Resilience, but Tie up Your Camel
In the event that climate change does tend toward the higher estimates put forward by the United Nations and other groups, it is reasonable to consider insurance options that might help deal with such changes. Such options might include government investment in geoengineering research, investment in research and development to advance technologies for removing GHGs from the atmosphere, and possibly the creation of a climate adaptation fund to be used where state and local governments find themselves unable to cope with a given climate change, or even to compensate others should it ultimately be shown that U.S. emissions of GHGs have harmed other countries or the property of other individuals.
It has long been known that certain types of risk are not suited to attempted prevention, but instead must be met with the resilience needed to live with the risk. Climate change is one such risk that is, as the world is increasingly observing, virtually impossible to prevent, whether it is manmade or natural.
As efforts to mitigate GHGs fail around the world, it is long past time to broaden the tools available to us in order to make our society resilient to climate risk. Rather than remain largely focused on the quixotic effort to reduce GHG emissions or to stand athwart the stream of climate and shout “stop, enough!” we should shift the majority of our policymaking attention to an agenda of resilience building and adaptation, two areas with which governments particularly struggle. Plan B for climate resilience should consist of an aggressive program of resilience building through the elimination of risk subsidies and the privatization of infrastructure. Other subsidies and regulations that make the overall economy more brittle in the face of climate change would also be ripe targets for removal, such as those that permeate energy and water markets.
1. Steven F. Hayward and Kenneth P. Green, “Waxman-Markey: An Exercise in Unreality,” AEI Energy and Environment Outlook, no. 3 (July 2009), available at www.aei.org/outlook/100057.
2. It is possible that per-capita CO2 emissions were never this low even before the advent of widespread use of fossil fuels: wood burning by Americans in the nineteenth century may have produced more than 2.4 tons of CO2 per capita. Much depends on the emissions coefficient for wood burning and how, since wood is biomass rather than a fossil fuel, reforestation is credited in carbon accounting. In 1875, burning wood generated twice as much energy as fossil fuels.
3. Kenneth P. Green, Steven F. Hayward, and Kevin A. Hassett, “Climate Change: Caps vs. Taxes,” AEI Environment and Energy Outlook, no. 2 (June 2007), available at www.aei.org/outlook/26286.
4. Aaron Wildavsky, Searching for Safety (New Brunswick, NJ: Transaction Publishers, 1988). Wildavsky used the terms “resilience” and “anticipation” rather than “resilience” and “interception.” In adapting Wildavsky’s framework to more recent risk-related issues, I have chosen to use “interception” because it corresponds better to common perceptions of how risk regulations work.
5. Ibid.
6. Mark L. Burton and Michael J. Hicks, “Hurricane Katrina: Preliminary Estimates of Commercial and Public Sector Damages” (Huntington, WV: Marshall University Center for Business and Economic Research, September 2005), available at www.marshall.edu/cber/research/katrina/Katrina-Estimates.pdf (accessed September 24, 2009).
7. Charles Perrow, The Next Catastrophe: Reducing Our Vulnerabilities to Natural, Industrial, and Terrorist Disasters (Princeton, NJ: Princeton University Press, 2007), 37–38.
This post is an excerpt from a longer Environment and Energy Outlook, published by the American Enterprise Institute. The full study can be found here.

In general, the mainstream response to the issue of climate change has been reactive, pessimistic, authoritarian, and resistant to change. Those alarmed about a changing climate would stand athwart the stream of climate history and cry “stop, enough!” Rather than working to cease human influence on climate, they want to find a way to make the climate stand still. This focus on creating climate stasis has led to policy proposals that would have been laughed at or dismissed as wacky conspiracy theories in the 1980s. Mainstream climate activists are proposing nothing less than the establishment of global weather control through energy rationing, regulations, and taxes, all managed by a global bureaucracy, a course that would lead humanity into a future that is smaller, more costly, and less dynamic over time. Environmental groups, along with organizations like the United Nations IPCC, are in effect calling for the imposition of climate stasis on a chaotic system.

Consider the climate bill now before Congress: the Waxman-Markey American Clean Energy and Security Act. Waxman-Markey sets the ambitious target of reducing total U.S. GHG emissions by 83 percent relative to 2005 levels by the year 2050 (with intermediate benchmarks at 2020 and 2030). Thus, the cap, and the allowances sold pursuant to it, will be lowered from a peak of 5.4 billion tons in 2016 to just a little over 1 billion tons in 2050. As my colleague Steven F. Hayward and I have pointed out elsewhere, these targets are absurd. From Department of Energy (DOE) historical statistics on energy consumption, it is possible to estimate that the United States last emitted 1 billion tons in the year 1910, when the nation’s population was only 92 million people, per-capita income (in 2008 dollars) was only $6,196, and total GDP (also in 2008 dollars) was about $572 billion—about one-twenty-fifth the size of the U.S. economy today. By the year 2050, however, the United States is expected to have a population of 420 million, according to Census Bureau projections—more than four times the population of 1910. In order to reach the 83 percent reduction target, per-capita carbon dioxide (CO2) emissions would have to be no more than 2.4 tons per person—less than one-quarter the level of per-capita emissions in 1910.
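The arithmetic behind these figures is easy to reproduce. The Python sketch below simply recomputes the implied per-capita numbers from the values quoted in the text; it is a rough consistency check, not a substitute for the DOE and Census data cited.

```python
# Rough check of the per-capita emissions arithmetic, using only the
# approximate figures quoted in the text.

cap_2050_tons = 1.0e9         # allowed U.S. CO2 emissions in 2050 under the 83% cut
population_2050 = 420e6       # Census Bureau projection cited above
emissions_1910_tons = 1.0e9   # estimated U.S. emissions circa 1910
population_1910 = 92e6        # U.S. population in 1910

per_capita_2050 = cap_2050_tons / population_2050        # ~2.4 tons per person
per_capita_1910 = emissions_1910_tons / population_1910  # ~10.9 tons per person

print(f"Implied 2050 per-capita emissions: {per_capita_2050:.1f} tons")
print(f"1910 per-capita emissions:         {per_capita_1910:.1f} tons")
print(f"2050 target as a share of the 1910 level: {per_capita_2050 / per_capita_1910:.0%}")
```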

When did the United States last experience per-capita CO2 emissions of only 2.4 tons? From the limited historical data available, it appears that this was about 1875. In 1875, the nation’s GDP (in 2008 dollars) was $147 billion, per-capita income (in 2008 dollars) was $3,300, and the population was only 45 million.

Along with my colleague Kevin A. Hassett, Hayward and I have also written elsewhere about the problems with cap-and-trade and have suggested that a revenue-neutral carbon tax would be preferable, but that, too, represents an effort to impose stasis on a dynamic system, simply by more efficient means. A carbon tax is, to be sure, vastly superior to a cap-and-trade system, but it is doubtful that one could be enacted in a way that is actually revenue-neutral and is not abused by politicians looking to tax those they dislike and rebate the taxes to groups they favor, namely, those most inclined to vote for their party.

A more forward-looking, optimistic, and free-market approach to the risks of climate variability accepts that the climate has been, is, and will be variable; focuses on the risks of variability; and looks for ways to build resilience in the face of that change, regardless of cause.

Aaron Wildavsky’s Resilience Paradigm

Aaron Wildavsky, one of the great policy analysts of the late twentieth century, wrote extensively about the benefits of resilient social institutions. [Read more →]

October 23, 2009   4 Comments

Okay, Joe Romm: How about a Wager on $65 Oil? (‘peak-oil’ bull or closet bear?)

[After publication of Michael Lynch's New York Times op-ed on peak oil, Joseph Romm posted a response—and a challenge—on his website and later expanded it on The Huffington Post. Below is Lynch's response.]

Thank you very much for your invitation to a wager on the price of oil, Joe, which I take to be serious, even though you made no effort to convey the wager to me personally. (If you were simply making a ‘PR’ effort, feel free to withdraw it.) I would warn you that for most of my career I have been referred to as a ‘heretic’ or a ‘contrarian,’ and I have repeatedly outperformed other forecasters, explaining in a number of academic publications why the forecasting of oil prices and supply has been so deficient. That you appear to have been more prescient than I no doubt gives you confidence. But success can be misleading: in the long run, it’s better to be smart than lucky.

Parenthetically, bear in mind that I regard myself as a technocrat, not an ideologue, and my denigration of ‘peak oil’ theory comes from a careful reading of the ‘research’ as well as decades of research on the subject of oil and gas supply. I do not regard oil, gas, solar, nuclear or dung as ‘good’ or ‘bad’ except as the situation dictates.

In fact, for some years, I lived car-free, working and residing in the Boston area, dependent on the subway. Owing to an unfortunate editing error, my op-ed implied that I disapproved of conservation, which is most certainly not the case. Conservation can be one of the cheapest energy ‘sources’ around, depending on the circumstances. (Conservationism is another story.)

Back to 1996

Regarding our appearance at the Congressional hearing in 1996 (for which your post criticizes me): [Read more →]

October 21, 2009   9 Comments

High Capital Costs Plague Solar (RPS mandates, cost dilution via energy mixing required) Part II

Renewable energy generates a larger portion of the world’s electricity each year, but in relative terms solar power generation is hardly a blip on the energy screen despite its long history of technological development. Solar-generated electricity has one major advantage over its more ubiquitous cousin, wind power: electricity is generated during typical peak-demand hours, making this option attractive to utilities that value solar electricity for peak shaving. However, the capital costs of all the solar technologies are about $5,000/kW and higher, and projects are moving forward only in particular U.S. regions with tough RPS requirements and subsidies from state and federal governments.
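To put the $5,000/kW figure in perspective, here is a quick back-of-the-envelope cost calculation in Python. The 100 MW plant size and the 25% capacity factor are assumptions chosen for illustration, not numbers from this article.

```python
# Back-of-the-envelope economics of a utility-scale solar plant at ~$5,000/kW.
# Plant size and capacity factor are illustrative assumptions.

capital_cost_per_kw = 5_000   # dollars per kW, the lower bound quoted in the text
plant_size_mw = 100           # assumed nameplate capacity
capacity_factor = 0.25        # assumed average output as a share of nameplate
hours_per_year = 8_760

total_capital_cost = capital_cost_per_kw * plant_size_mw * 1_000
annual_output_mwh = plant_size_mw * capacity_factor * hours_per_year

print(f"Total capital cost:     ${total_capital_cost / 1e6:,.0f} million")  # ~$500 million
print(f"Expected annual output: {annual_output_mwh:,.0f} MWh")              # ~219,000 MWh
```

Numbers of this size help explain why projects cluster in regions where RPS mandates and subsidies carry much of the cost.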

In Part I, we reviewed the enormous scale and capital cost considerations of photovoltaic projects and then introduced the standard taxonomy of central solar power generating plants. By far the favored technology for utility-scale projects is the concentrated solar power (CSP) option, which produces electricity either by generating thermal energy to drive the familiar steam-turbine cycle or by concentrating the sun’s thermal energy on an air heat exchanger to drive a gas turbine. In this Part II, we review a sampling of recent projects. In sum, CSP and Stirling-engine technologies appear to be favored in the U.S., while the “turbine on a stick” projects are gaining a foothold elsewhere.

The final post will explore the latest developments in hybrid projects that combine many of the available solar energy conversion technologies with conventional fossil-fueled technologies. Hybrid projects offer utilities the opportunity to reduce fuel costs while simultaneously helping them cope with onerous renewable portfolio mandates.

Pacific Gas and Electric Co. (PG&E) was the most solar-integrated utility in the U.S. last year, followed by Southern California Edison and San Diego Gas & Electric, according to new rankings released earlier this year by the Solar Electric Power Association (SEPA). It is no great surprise that all three utilities serve California residents.

PG&E interconnected 85 MW of new capacity—a number representing 44% of the survey total, the trade group found in its “2008 Top Ten Utility Solar Integration Rankings.” The report surveyed 92 utilities, identifying those with the most significant amounts of solar electricity integrated into their portfolios. On a cumulative solar-megawatt basis, Southern California Edison was ranked first, followed by PG&E and Nevada utility NV Energy. “This year, the report demonstrated that the utility segment is making a major investment to increase the amount of solar energy in power portfolios, with many utilities doubling the amount of solar power in their portfolio in just one year,” SEPA said. The overall installed solar capacity of the top 10 ranked utilities rose from 711 MW to 882 MW, a growth of about 24%. SEPA cited renewable portfolio standards, impending carbon policy, and fluctuating costs of power generation and fuel resources as the primary factors driving this growth.
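As a quick arithmetic check on the SEPA figures above, the short sketch below recomputes the top-10 growth rate and the survey-wide new capacity implied by PG&E's 44% share; it simply rearranges the numbers already quoted.

```python
# Quick consistency check of the SEPA ranking figures quoted above.

top10_cumulative_2007_mw = 711   # prior-year installed capacity of the top 10 utilities
top10_cumulative_2008_mw = 882   # 2008 installed capacity of the top 10 utilities
pge_new_capacity_mw = 85         # PG&E's newly interconnected capacity
pge_share_of_survey = 0.44       # PG&E's share of all new capacity in the survey

growth_rate = (top10_cumulative_2008_mw - top10_cumulative_2007_mw) / top10_cumulative_2007_mw
implied_survey_new_mw = pge_new_capacity_mw / pge_share_of_survey

print(f"Top-10 cumulative capacity growth: {growth_rate:.0%}")                                # ~24%
print(f"Implied new capacity across all surveyed utilities: {implied_survey_new_mw:.0f} MW")  # ~193 MW
```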

Participating utilities had an average of 11 MW in their cumulative portfolios, and the top 10 utilities represented 93% of all solar capacity. Because of their head start, the large investor-owned utilities in California are likely to retain a lead in the overall cumulative rankings even as the year-to-year rankings shift, SEPA said. [Read more →]

October 20, 2009   1 Comment