Posts from — October 2009
Editor note: Jon Boone’s previous post on industrial wind parks led to this interview by Michael Morgan of Allegheny Treasures, an information resource dedicated to preserving the historic mountains of West Virginia and understanding the impact of industrial wind installations.
Introduction: It’s been extremely difficult to bridge the gap between those who know little about the issue and those who have a more comprehensive understanding of the workings of the electrical grid and the related technologies that supply it, such as wind energy. For many, their only information comes from the local press, “green” promotions by so-called environmental organizations, and occasional visits to web sites dedicated to one side or the other. It’s often a mind-boggling quagmire!
The following conversation with Jon Boone, who now lives in Oakland, MD after a 30 year career at the University of Maryland, College Park, is an attempt to bridge that gap, perhaps allowing us to better understand the limitations of and problems associated with industrial wind technology. He has no dog in the fight.
Allegheny Treasures – (Michael Morgan, interviewer): Mr. Boone, wind developers and their supporters portray their technology as a viable source of renewable electricity, providing “nearly” free power by capturing the wind – a virtually inexhaustible source of energy. Their mantra is that wind energy is “free, clean, and green.” Can you explain your concern with this portrayal?
Mr. Boone: Industrial wind technology is a meretricious commodity, attractive in a superficial way but without real value—seemingly plausible, even significant, but actually false and nugatory. Those who would profit from it either economically or ideologically are engaged in wholesale deception. All adults should know that if something seems too good to be true, it almost always is. Although the wind itself may be “free,” the cost of converting it to electrical energy is extremely expensive. A 100MW wind project would cost, in today’s market, about $350 million, most of it paid for by taxpayers.
AT – Morgan: And—sorry to interrupt—what about its benefits, such as its alleged ability to shut down fossil fuel plants?
Boone: In contrast to wind proponents’ alluring but empty promises of closed coal plants and reduced carbon emissions is this reality: wind energy is impotent while its environmental footprint is massive and malignant. It can’t dent a grape in the energy scheme of things; it’s a sideshow technology with great potential for mainline environmental harm. In some ways, it’s almost the perfect enterprise for our era, as it produces no meaningful product or service but is subsidized up to 80 percent by rate and taxpayers. Like many “celebrities,” it is famous for being famous, not for its actual performance.
AT – Morgan: Would you explain? [Read more →]
October 31, 2009 13 Comments
Political Capitalism: Understanding the Beast that Broke the Cage (Part I: what is political capitalism?)
Editor note: This piece is reproduced from the website www.politicalcapitalism.org with the permission of the author. This post, the first in a series, is germane to the current debate over climate/energy legislation that is backed by a number of large U.S. corporations (Enron then; GE, Duke, DuPont, etc. now).
Political capitalism is a private-property, market-oriented system that is compromised by business-sponsored government intervention. It is a socioeconomic system in which many or most regulations, subsidies, and tax-code provisions result from the lobbying efforts of directly affected businesses and their allies.
Today in the United States, there is greater political transparency and competition between political elites than was evident in the business-dominated past (the 19th and most of the 20th centuries). Interventions routinely result from non-business special interests representing education, the environment, labor, minorities, religion, retirees, science, and taxpayers, among others. Still, business interests—unified or in opposition—are arguably the most important of the elites that compete for special government favor in American politics today.
There are two avenues to business success under a private-property, profit-and-loss system. When using the economic means, or free-market means, businessmen provide goods or services in an open market and rely on voluntary consumer patronage. When using the political means, businessmen obtain a governmental restriction or favor that provides the margin of success beyond what consumer preference alone would give. Market entrepreneurship is the way of capitalism; political entrepreneurship, or rent-seeking as it is known in the economics literature, is the way of political capitalism.
Business interests welcome competition for the things they buy (to minimize costs) far more than for things they sell. They may profess support for free enterprise in general but not in their particular area. There, competition is disparaged as “unbridled,” “cut-throat,” “excessive,” or “unfair,” and calls are made to constrain the free market.
Historian Gabriel Kolko has defined political capitalism as “the utilization of political outlets to attain conditions of stability, predictability, and security—to attain rationalization—in the economy.” [Read more →]
October 30, 2009 2 Comments
One of the ugliest battles in the blogosphere climate wars has involved the newly released Superfreakonomics, sequel to the best-selling Freakonomics. In the new book’s final chapter (available here in pdf), economist Steven Levitt and journalist Stephen Dubner set out to challenge the view that massively restricting carbon emissions is the only hope for averting planetwide catastrophe.
In this post I will link to some of the major commentary on the book so far, and then focus on U.C. Berkeley economist Brad DeLong’s specific claims that Levitt and Dubner’s arguments in support of geoengineering are somehow “bad economics.” As we’ll see, Levitt and Dubner might be wrong, but if so they are wrong because of the numbers. DeLong is painting their views as self-evidently absurd, but that’s only because he himself is overlooking a basic economic point.
Not surprisingly, the climate scientists and economists who are most vocal about the need for drastic emissions cutbacks were furious when the book’s contents began circulating. Joe Romm got the ball rolling with this fiery post; his ally in such matters, Paul Krugman, soon followed suit. Dubner defended himself and co-author Levitt against Romm’s accusations of intentional distortion in this post, and one of the primary sources for the chapter, physicist (and all-around guru) Nathan Myhrvold, defended himself from Romm’s accusations of ignorance here.
In the present post, [Read more →]
October 29, 2009 1 Comment
On October 16, 2009, the Houston Chronicle ran an Outlook piece by Dr. Ronald Sass, a fellow in global climate change at the Baker Institute and Professor of Natural Sciences emeritus at Rice University, complaining that Texas governor Rick Perry was getting his ideas about climate change from unreliable sources. Apparently, that Governor Perry is not hopping on the climate alarmism/policy activism bandwagon has Dr. Sass a bit concerned. Make no mistake: the political agenda behind this advice goes far beyond natural-science issues.
Dr. Sass argues that the latest findings of the Intergovernmental Panel on Climate Change (IPCC) should be the be-all and end-all of the physical science debate. But he is behind the times. The IPCC report is several years old, and the latest theory and empirical data are pointing in more benign directions than at the height of the climate alarm in the late 1990s. Perhaps the governor’s stance isn’t as off-base as Sass would like the Chronicle’s readers to believe.
Some Good Points
Dr. Sass does make some good points about the science, but he attributes too much import to the implications of setting the Governor straight on them.
For instance, Dr. Sass points out, rightly, that human activities—primarily the burning of fossil fuels for energy—have contributed to the build-up of the atmospheric concentration of carbon dioxide and an enhancement of the earth’s natural greenhouse effect. If Governor Perry is being advised otherwise, he is getting bad advice.
And Dr. Sass is correct that an enhanced greenhouse effect will lead, in general, to a modified, and warmer, climate. Again, if the governor is getting advised differently, he is being misinformed.
The Rest of the Story
However, from here Sass goes astray.
It does not follow that, just because humans are modifying the climate, the outcome will be bad and we should undertake immediate efforts to stop it (which is what Sass would like the Governor to do). [Read more →]
October 28, 2009 5 Comments
Today, the Senate Environment and Public Works Committee will hold the first of three hearings on S. 1733, the Clean Energy Jobs and American Power Act, also known as Kerry-Boxer, after its co-sponsors Senators John Kerry (D-MA) and Barbara Boxer (D-CA). Kerry-Boxer is the Senate companion bill to H.R. 2454, the American Clean Energy and Security Act (ACESA), also known as Waxman-Markey, after its co-sponsors Reps. Henry Waxman (D-CA) and Ed Markey (D-MA).
For those worried about the economic impacts of these bills, I bring unwelcome news: their bite is worse than their bark. Escalator clauses common to both bills, ignored in most previous analyses, are the setup for dramatic increases in regulatory stringency well beyond the bills’ explicit emission reduction targets. Similarly, “findings” presenting the “scientific” rationale for cap-and-trade are not mere rhetorical fluff but precedents for litigation targeting emission sources considerably smaller than those explicitly identified as “covered entities.”
The Economic Debate So Far
Much of the economic debate on Waxman-Markey and Kerry-Boxer has been about the likely impacts of the specific emission-reduction targets proposed in these bills. The Heritage Foundation, National Black Chamber of Commerce, and American Council on Capital Formation/National Association of Manufacturers project substantial GDP and job losses. In contrast, the Environmental Protection Agency and Congressional Budget Office project much smaller costs.
EPA says Waxman-Markey would cost the average household only $140 annually. CBO puts the figure at about $175. Citing these estimates, proponents say cap-and-trade is cheap, costing only a postage stamp per person per day.
Heritage Foundation scholar Dr. David Kreutzer points out that EPA improperly used a technique called “discounting” to make the costs of cap-and-trade seem tiny: $140 is what a household would have to invest today, with compound interest, to pay the costs of Waxman-Markey in 2050. But according to EPA’s own numbers, in 2050 the cost to a family of four would be $2,700.
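Kreutzer’s point about discounting can be illustrated with a small sketch. The numbers here are illustrative only: the 5% discount rate and 40-year horizon are assumptions for the example, not EPA’s actual methodology or parameters.

```python
# Illustration of how discounting makes a distant cost look small.
# The 5% rate and 40-year horizon are assumptions for this sketch,
# not EPA's actual methodology.

def present_value(future_cost: float, rate: float, years: int) -> float:
    """Amount that, invested today at compound rate `rate`,
    grows into `future_cost` after `years` years."""
    return future_cost / (1 + rate) ** years

# A $2,700 household cost arriving around 2050 shrinks dramatically
# when discounted back four decades at 5% per year:
pv = present_value(2700, 0.05, 40)
print(round(pv, 2))  # roughly 383.5
```

The general pattern holds for any positive discount rate: the further out the cost and the higher the rate, the smaller the present-value figure, which is how a headline number like $140 can coexist with a much larger out-year cost.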
Kreutzer and his colleagues also note that the CBO analysis does not estimate the GDP loss from higher energy prices, only the cost of the energy ration coupons, 85% of which are to be distributed free-of-charge in the initial years of the program.
A Breakthrough Institute analysis suggests that Waxman-Markey may cost pennies on the dollar because the bill’s “offset” provisions could allow “covered entities” to increase their emissions at business-as-usual rates through 2030. An offset is a credit earned for investments in projects — either in developing countries or in U.S. economic sectors outside the cap, such as agriculture and forestry — that reduce, avoid, or sequester emissions. The bill allows capped sources to utilize up to 2 billion tons’ worth of domestic and international offset credits each year in lieu of reducing their own emissions. If the offsets option is “fully utilized,” says Breakthrough, “Emissions in sectors of the economy supposedly ‘capped’ by ACESA could continue to grow at BAU rates until as late as 2037.”
This assessment is not a realistic projection of what is likely to happen. [Read more →]
October 27, 2009 5 Comments
On June 26, 2009, the U.S. House of Representatives passed the Waxman-Markey climate bill, known also as the cap-and-trade bill. That nickname misleads, because cap-and-trade takes up no more than 30 percent of the bill’s pages. The rest of the telephone-book-sized HR 2454 details new regulations, wealth transfers, and taxes whose aggregate adverse impacts may well surpass those of cap-and-trade.
Still more encouragement for renewable resources that cannot pass market tests. A national “renewable portfolio standard” will require that 20 percent of the nation’s electricity in 2020 (relative to 2.8 percent today) come from sources the law defines as “renewable” or (to a limited degree) from improvements in efficiency. For the past ten years, wind power has been the only renewable whose output has grown substantially, and it is viable only because of a federal production tax credit. Other renewables are generally even less economic. The national RPS is special-interest legislation for wind.
Research on fuels and technologies that harms consumers and reduces employment opportunities. Section 114 institutes a new $1 billion annual tax on power from coal- and gas-fired [Read more →]
October 26, 2009 4 Comments
Editor Note: Jon Boone, a lifelong environmentalist, co-founded the North American Bluebird Society and has consulted for the Roger Tory Peterson Institute in New York. He has been a formal intervenor in two Maryland Public Service Commission hearings and produced and directed the documentary, Life Under a Windplant.
Industrial wind technology is a meretricious commodity, attractive in a superficial way but without real value—seemingly plausible, even significant but actually false and nugatory.
Those who would profit from it either economically or ideologically are engaged in wholesale deception. For in contrast to their alluring but empty promises of closed coal plants and reduced carbon emissions is this reality: Wind energy is impotent while its environmental footprint is massive and malignant.
A wind project with a rated capacity of 100 MW, for example, with 40 skyscraper-sized turbines, would likely produce an annual average of only 27 MW, an imperceptible fraction of energy for most grid systems. More than 60% of the time it would produce less than 27 MW, and at times of peak demand it would often produce nothing. It would rarely achieve its rated capacity, producing the most at times of least demand. Whatever it generated would be continuously skittering, intensifying and magnifying the destabilizing effects of demand fluctuations, for wind volatility is virtually indistinguishable from the phenomenon of people whimsically turning their appliances off and on.
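The arithmetic in the example above can be checked directly. This is a sketch using only the figures in the text; the 27% capacity factor is what those figures imply, not a universal constant for wind projects.

```python
# Back-of-envelope check of the 100 MW wind example above,
# using only the figures given in the text.
rated_mw = 100        # nameplate (rated) capacity
avg_output_mw = 27    # annual average output cited above

capacity_factor = avg_output_mw / rated_mw
annual_mwh = avg_output_mw * 8760  # average MW x hours per year

print(f"capacity factor: {capacity_factor:.0%}")  # 27%
print(f"annual energy:   {annual_mwh:,} MWh")     # 236,520 MWh
```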
Moreover, the project could never produce capacity value—specified amounts of energy on demand, something that should be anathema to regulatory agencies, with their task of ensuring reliable, secure, affordable electricity. The ability of machines to perform as expected on demand is the basis of modernity, underlying contemporary systems of economic growth, wealth creation and well-being.
Machinery that doesn’t perform this way is quickly discarded, although this wasn’t the case for much of history (consider the early days of television, or radio, or even the automobile). Only in the last hundred years or so has the West come to rely on machines held to this standard. Capacity value allows society to go from pillar to post in accordance with its own schedule. Wind provides no capacity value and can pass no test for reliability; one can never be sure how much energy it will produce at any future time. And generating units that don’t provide capacity value cannot be reasonably—and favorably—compared with those that do.
Adding wind instability to a grid may be an engineer’s idea of job security, but it is criminal for ratepayers, taxpayers, and a better environment. For the grid is then forced to extend itself. As the wind bounces randomly around the system, operators must continuously balance it to match supply precisely with demand, compensating for the ebb and flow much in the way flippers keep the steel ball in play during a game of pinball. [Read more →]
October 24, 2009 13 Comments
In general, the mainstream response to the issue of climate change has been reactive, pessimistic, authoritarian, and resistant to change. Those alarmed about a changing climate would stand athwart the stream of climate history and cry “stop, enough!” Rather than working to cease human influence on climate, they want to find a way to make the climate stand still. This focus on creating climate stasis has led to policy proposals that would have been laughed at or dismissed as wacky conspiracy theories in the 1980s. But mainstream anti-climate-change activists are proposing nothing less than the establishment of global weather control through energy rationing, regulations, and taxes, all managed by a global bureaucracy, with a goal of leading humanity into a future that will become smaller, more costly, and less dynamic over time. Environmental groups, along with organizations like the United Nations IPCC, are calling for the imposition of climate stasis on a chaotic system.
Consider the climate bill now before Congress: the Waxman-Markey American Climate and Energy Security Act. Waxman-Markey sets the ambitious target of reducing total U.S. GHG emissions by 83 percent below 2005 levels by the year 2050 (with intermediate benchmarks at 2020 and 2030). Thus, the cap and the allowances sold pursuant to it will be lowered from a peak of 5.4 billion tons in 2016 to just a little over 1 billion tons in 2050. As my colleague Steven F. Hayward and I have pointed out elsewhere, these targets are absurd. From Department of Energy (DOE) historical statistics on energy consumption, it is possible to estimate that the United States last emitted 1 billion tons in the year 1910, when the nation’s population was only 92 million people, per-capita income (in 2008 dollars) was only $6,196, and total GDP (also in 2008 dollars) was about $572 billion—about one-twenty-fifth the size of the U.S. economy today. By the year 2050, however, the United States is expected to have a population of 420 million, according to Census Bureau projections—more than four times the population of 1910. In order to reach the 83 percent reduction target, per-capita carbon dioxide (CO2) emissions will have to be no more than 2.4 tons per person—only one-quarter the level of per-capita emissions in 1910.
When did the United States last experience per-capita CO2 emissions of only 2.4 tons? From the limited historical data available, it appears that this was about 1875. In 1875, the nation’s GDP (in 2008 dollars) was $147 billion, per-capita income (in 2008 dollars) was $3,300, and the population was only 45 million.
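The per-capita arithmetic in the two paragraphs above is easy to verify with the figures as given in the text:

```python
# Verifying the per-capita emissions arithmetic above,
# using the figures as given in the text.
cap_2050_tons = 1.0e9     # ~1 billion tons of CO2 allowed in 2050
population_2050 = 420e6   # Census Bureau projection cited above

per_capita = cap_2050_tons / population_2050
print(round(per_capita, 1))        # 2.4 tons per person

# The text says this is one-quarter of 1910's per-capita level,
# implying 1910 emissions of roughly:
print(round(per_capita * 4, 1))    # about 9.5 tons per person
```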
My colleague Kevin A. Hassett, Hayward, and I have also written elsewhere about the problems with cap-and-trade and suggested that a revenue-neutral carbon tax would be preferable, though that, too, represents an effort to impose stasis on a dynamic system, simply by more efficient means. A carbon tax is, to be sure, vastly superior to a cap-and-trade system, but it is doubtful that one could be enacted in a way that is actually revenue-neutral and not abused by politicians looking to tax groups they dislike and rebate the proceeds to groups they favor, namely those most inclined to vote for their party.
A more forward-looking, optimistic, and free-market approach to the risks of climate variability accepts that the climate has been, is, and will be variable; focuses on the risks of variability; and looks for ways to build resilience in the face of that change, regardless of cause.
Aaron Wildavsky’s Resilience Paradigm
Aaron Wildavsky, one of the great policy analysts of the late twentieth century, wrote extensively about the benefits of resilient social institutions. [Read more →]
October 23, 2009 4 Comments
[After publication of my New York Times Op-ed on peak oil, Joseph Romm posted a response—and a challenge—on his website, and later expanded it on The Huffington Post. Below is Michael Lynch's response.]
Thank you very much for your invitation to a wager on the price of oil, Joe, which I take to be serious, even though you made no effort to convey the wager to me personally. (If you were simply making a “PR” effort, feel free to withdraw it.) I would warn you that for most of my career I have been referred to as a ‘heretic’ or ‘contrarian’ and have repeatedly outperformed other forecasters by explaining (in a number of academic publications) why the forecasting of oil price and supply has been so deficient. That you appear to have been more prescient than me no doubt gives you confidence. But success can be misleading: in the long run, it’s better to be smart than lucky.
Parenthetically, bear in mind that I regard myself as a technocrat, not an ideologue, and my denigration of ‘peak oil’ theory comes from a careful reading of the ‘research’ as well as decades of research on the subject of oil and gas supply. I do not regard oil, gas, solar, nuclear or dung as ‘good’ or ‘bad’ except as the situation dictates.
In fact, for some years, I lived car-free, working and residing in the Boston area, dependent on the subway. Owing to an unfortunate editing error, my op-ed implied that I disapproved of conservation, which is most certainly not the case. Conservation can be one of the cheapest energy ‘sources’ around, depending on the circumstances. (Conservationism is another story.)
Back to 1996
October 21, 2009 9 Comments
Renewable energy generates a larger portion of the world’s electricity each year. But in relative terms, solar power generation is hardly a blip on the energy screen despite its long history of technological development. Solar-generated electricity has one major advantage over its more ubiquitous cousin, wind power: electricity is generated during typical peak-demand hours, making this option attractive to utilities that value solar electricity for peak shaving. However, the capital costs of all the solar technologies are about $5,000/kW and higher, and projects are moving forward only in particular U.S. regions with tough RPS requirements and subsidies from the states and the federal government.
In Part I, we reviewed the enormous scale and capital cost considerations of photovoltaic projects and then introduced the standard taxonomy of central solar power generating plants. By far the favored technology for utility-scale projects is concentrated solar power (CSP), which either produces thermal energy to generate electricity through the familiar steam-turbine process or concentrates the sun’s thermal energy on an air heat exchanger to produce electricity via a gas turbine. In this Part II, we review a sampling of recent projects. In sum, CSP and Stirling-engine technology appear to be favored in the U.S., while the “turbine on a stick” projects are gaining a foothold elsewhere.
The final post will explore the latest developments in hybrid projects that combine many of the available solar energy conversion technologies with conventional fossil-fueled technologies. Hybrid projects offer the opportunity for utilities to reduce fuel costs, while simultaneously helping utilities cope with onerous renewable portfolio mandates.
Pacific Gas and Electric Co. (PG&E) was the most solar-integrated utility in the U.S. last year, followed by Southern California Edison and San Diego Gas & Electric, according to new rankings released earlier this year by the Solar Electric Power Association (SEPA). It’s no great surprise all three utilities serve California residents.
PG&E interconnected 85 MW of new capacity—a number representing 44% of the survey total, the trade group found in its “2008 Top Ten Utility Solar Integration Rankings.” The report surveyed 92 utilities, identifying those that have the most significant amounts of solar electricity integrated into their portfolios. On a cumulative solar megawatt basis, Southern California Edison was ranked first, followed by PG&E and Nevada utility NV Energy. “This year, the report demonstrated that the utility segment is making a major investment to increase the amount of solar energy in power portfolios, with many utilities doubling the amount of solar power in their portfolio in just one year,” SEPA said. The overall installed solar capacity of the top 10 ranked utilities rose from 711 MW to 882 MW, growth of about 24%. SEPA cited renewable portfolio standards, impending carbon policy, and fluctuating costs of power generation and fuel resources as primary factors driving this growth.
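The growth figure can be checked from the quoted capacities; this is a quick sketch on the numbers as reported by SEPA.

```python
# Checking the top-10 growth figure from the SEPA capacities above.
prev_mw = 711   # top-10 installed solar capacity, prior year
curr_mw = 882   # top-10 installed solar capacity, 2008 survey

growth = (curr_mw - prev_mw) / prev_mw
print(f"{growth:.1%}")  # 24.1% on the stated figures
```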
Participating utilities had an average of 11 MW in their cumulative portfolio, and the top 10 utilities represented 93% of all solar capacity. Because of their head start, the large investor-owned utilities in California are likely to retain a lead in the overall cumulative rankings even as the year-to-year rankings shift, SEPA said. [Read more →]
October 20, 2009 1 Comment