A free-market energy blog

Posts from — September 2009

A Death Spiral for Climate Alarmism, Redux?

Desperation is setting in among climate alarmists who by their own math can see that the window is rapidly closing on “saving the planet.”

James Hansen, for instance, said three years ago in the New York Review of Books: “We have at most ten years—not ten years to decide upon action, but ten years to alter fundamentally the trajectory of global greenhouse emissions.” That was also Al Gore’s estimate in “An Inconvenient Truth.” But the time has been ticking away, and it’s increasingly obvious that the Gore/Hansen “wrenching transformation” of the U.S. energy system is simply not going to happen.

Perhaps Copenhagen will make it official.

U.S. cap-and-trade, in particular, has become a big political liability, as polls show voters are relatively unconcerned about climate change and deeply averse to higher energy prices. That has led Senator John Kerry, for example, to try to hide the ball by changing the name of the scheme to “pollution reduction” in order to obscure the reality that it’s basically a massive energy tax. Other Left-leaning politicians (the latest being Houston Mayor Bill White, who is running for the U.S. Senate) are announcing their opposition to cap-and-trade. (1)

Renewable energy is also getting more scrutiny than ever before, awakening not only cost-conscious middle America but grass-roots environmentalists concerned about negative local impacts and big-business intrusion. [Read more →]

September 30, 2009   12 Comments

Horsepower Sure Beats Horses! (Part I: remembering what came before cars–and the failure of the electric vehicle)

The energy policy debate is well informed by history. Many of the ‘silver bullets’ being proffered by the energy interventionists/transformationists of the Obama Brain Trust (‘smartest guys in the room’?) are yesterday’s failures. As F. A. Hayek would put it, the Holdren-Chu approach to energy suffers from the ‘fatal conceit’ and cannot be expected to be cost-effective in addressing the alleged problem.

Whither the Electric Vehicle

Take the electric vehicle versus the internal combustion engine. The market verdict of a century ago still holds–and for the same reasons. Thomas Edison was correct to pronounce the verdict to Henry Ford in 1896.

Edison himself labored to make batteries more economical for the transportation market, but the problems of weight and poor energy density could not be overcome. A 1914 news splash by Ford Motor Company about an “experimental” car, the “Ford Electric” (based on Edison’s work, described as “Mr. Ford’s personal project,” and slated to sell for $900 with a range of 100 miles), never got off the ground. And Edison’s alkaline battery, which did penetrate the truck market, was rejected by car makers because of its size and an incremental cost of between $200 and $600 per vehicle. (1)

So it was back to 1896 for Ford and Edison despite the latter’s $1.5 million effort to commercialize batteries for the car. (2) 

Horse Pollution

Consider horse transportation and what supplanted it.

The quotations below should remind the reader of how big a step it was for transportation to become energized by affordable, plentiful, transportable, dense, reliable energy–and that was petroleum.

“In New York City alone at the turn of the century, horses deposited on the streets every day an estimated 2.5 million pounds of manure and 60,000 gallons of urine, accounting for about two-thirds of the filth that littered the city’s streets. Excreta from horses in the form of dried dust irritated nasal passages and lungs, then became a syrupy mass to wade through and track into the home whenever it rained. New York insurance actuaries had established by the turn of the century that infectious diseases, including typhoid fever, were much more frequently contracted by livery stable keepers and employees than by other occupational groups, and an appeal to the Brooklyn Board of Health to investigate resulted in the institution of new municipal regulations on stables, compelling more frequent removal of excreta and disinfecting of premises. [Read more →]

September 29, 2009   10 Comments

What Does the Last Decade Tell Us about Global Warming? (Hint: the ‘skeptics’ have the momentum)

“Worldwide temperatures haven’t risen much in the past decade…. If you are a climate-change activist pointing to year after year of mounting climate crises, you might want to rethink your approach.”

- Richard Kerr, Science, May 2, 2008.

There has been a flurry of activity in recent weeks in the discussion as to the significance (scientific, political, social) of the evolution of the global average surface temperature during the past 10 years or so.

For those of you who don’t know, the surface temperature of the globe, as a whole, has not warmed up by anyone’s calculation since at least the turn of the century (January 2001) and, depending on your dataset and statistical technique of choice, perhaps as far back as January 1997. And all of this non-warming occurred over a period of time during which global emissions of CO2 increased faster than ever before (thanks primarily to China). In fact, anthropogenic greenhouse-gas forcing is about 5 percent greater now than a decade ago (atmospheric CO2 is up about 16 parts per million).
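
As a back-of-envelope check (not from the original post), the standard simplified expression for CO2 radiative forcing, dF = 5.35 ln(C/C0) W/m², gives the same order of magnitude. A minimal sketch, in which the concentrations and the total-forcing figure are assumed round numbers:

```python
# Minimal sketch of the forcing arithmetic. The CO2 concentrations and the
# total-GHG-forcing figure are assumed round numbers, not values from the post;
# 5.35*ln(C/C0) is the standard simplified CO2 forcing expression.
import math

c_start = 370.0  # approx. global mean CO2 in ppm a decade earlier (assumed)
c_end = 386.0    # approx. global mean CO2 in ppm now (assumed; ~16 ppm higher)

delta_f = 5.35 * math.log(c_end / c_start)  # CO2 forcing added over the decade, W/m^2
total_ghg_forcing = 3.0                     # rough total anthropogenic GHG forcing, W/m^2 (assumed)

print(f"decadal CO2 forcing increment: {delta_f:.2f} W/m^2")
print(f"share of total GHG forcing: {delta_f / total_ghg_forcing:.1%}")
# Prints ~0.23 W/m^2 and ~8%, i.e., a single-digit-percent increase of the
# same order as the post's "about 5 percent" figure.
```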

To many folks who have, for years, been fed a constant diet of “the-world-is-heating-up-faster-than-ever-before-and-you-are-the-cause,” 9 to 12 years of no warming at all seems to indicate that something is amiss with this mantra.

This was reflected in a Gallup Poll last spring, which found the highest percentage yet of people who think that “global warming” is being “exaggerated.” And this number has been growing.

IPCC “Consensus” and Unwarming

The growth in climate realism (i.e., a realization that alarmists are overplaying the probable impact of CO2 emissions) has most certainly been sparked by the fact that the rate of the earth’s temperature rise has been slowing rather than accelerating, contrary to general IPCC conclusions. This development, naturally, plays into the political debate about (at the 11th hour if not midnight) “mitigating” potential climate change through carbon dioxide emissions reductions. [Read more →]

September 28, 2009   13 Comments

The Global Cooling Scare Revisited (‘Ice Age’ Holdren had plenty of company)

“Predictions of future climate trends by Stephen Schneider and other leading climatologists, based on the prevailing knowledge of the atmosphere in the early 1970s, gave more weight to the potential problem of global cooling than it now appears to merit.”

- Paul and Anne Ehrlich, Betrayal of Science and Reason (Washington: Island Press, 1996), p. 34.

Recent attention has been paid to the coming Ice Age talk of John Holdren and Stephen Schneider before they got global warming religion.

Here are some “global cooling” quotations and comments from an earlier era. While such concern was not a scientific ‘consensus,’ such as that created by the United Nations’ Intergovernmental Panel on Climate Change in favor of high-sensitivity anthropogenic global warming, the Ice Age scare was a very active hypothesis that should give pause to the Boiling Age purveyors of today.

“Certainly the threat of another ice age was the topic of much scientific and popular discussion in the 1970s. Books and articles entitled ‘The Cooling,’ ‘Blizzard,’ ‘Ice,’ and ‘A Mini Ice Age Could Begin in a Decade,’ abounded. The ‘snow blitz’ theory was popularized on the public television presentation of ‘The Weather Machine’ in 1975. And certainly the winters of the late 1970s were enough to send shivers through our imaginations.”

- Harold Bernard, Jr., The Greenhouse Effect (Cambridge, MA: Ballinger Publishing, 1980), p. 20.

“The worriers about cooling included Science, the most influential scientific journal in the world, quoting an official of the World Meteorological Organization; the National Academy of Sciences worrying about the onset of a 10,000 year ice age; Newsweek warning that food production could be adversely affected within a decade; the New York Times quoting an official of the National Center for Atmospheric Research; and Science Digest, the science periodical with the largest circulation.”

- Julian Simon, “What Does the Future Hold? The Forecast in a Nutshell,” in Simon, ed., The State of Humanity (Cambridge, Mass: Blackwell, 1995), p. 646. 

“In the early 1970s, the northern hemisphere appeared to have been cooling at an alarming rate. There was frequent talk of a new ice age. Books and documentaries appeared, hypothesizing a snowblitz or sporting titles such as The Cooling. Even the CIA got into the act, sponsoring several meetings and writing a controversial report warning of threats to American security from the potential collapse of Third World Governments in the wake of climate change.”

- Stephen Schneider, Global Warming: Are We Entering the Greenhouse Century? (San Francisco: Sierra Club Books, 1989), p. 199. [Read more →]

September 26, 2009   16 Comments

Secretary Chu, Repeat After Me: “Consumers Respond to Price Signals, Not Moral Exhortations” (remember Jimmy Carter?)

Thirty years after President Carter declared that our energy crisis was the “moral equivalent of war,” forever known as “MEOW,” we are faced with another federal potentate who is sure that he knows what is best for us. At a Smart Grid conference in Washington, D.C., Energy Secretary Steven Chu opined that “The American public … just like your teenage kids, aren’t acting in a way that they should act.”

Just as President Carter declared that our country’s failure to conserve natural gas and oil was a symptom of a “malaise,” not, heaven forfend, a consequence of the low prices for fuels sold at (federally) regulated prices, so does the current Energy Secretary believe that our citizenry is incapable of making rational decisions about energy use.

Why would the smartest guy in the room (read: central planner) say such a thing–a statement his press office now calls a mistake?

Incentives Matter

We consumers respond to economic incentives all the time. If the government offers an incentive to get rid of a car that was already paid for, then we will take the $4,500 and walk away with a new one; when the prices of oil and gas rise, people put on sweaters, turn down the thermostat, install new windows, and think about shorter commutes to work; if the government encourages banks to lend money at very low rates to anyone with a pulse, then people will borrow money to purchase houses they cannot afford; if the government pays companies to generate electricity using wind, then they will try to do so regardless of its specific utility in the energy mix. Incentives run the world of personal choices. People can only make rational decisions about the real alternatives that face them, not about some theoretical concerns far in the future.

Obviously, the Secretary thinks he was the only person to grow up in a household where dad told the kids to turn off the lights, shut the refrigerator door, close the windows in winter, and observe other staples of energy-conscious behavior. Only it was not really energy consciousness that motivated dad; it was the gas and electric bills at the end of the month.

I’ve got news for you, Mr. Secretary: a lot of us grew up with this dad, made fun of him at the time for his “light bulb fetish,” and now tell our children exactly the same things (and don’t track mud on the floor, while you’re at it!). [Read more →]

September 25, 2009   3 Comments

Is Joe Romm a 'Global Lukewarmer'?

“On our current emissions path, we’re going to … warm more than 4°C by century’s end.”

- Joseph Romm, Climate Progress, August 11, 2009

“I will be happy to bet anyone that the 2010s will be the hottest decade in the temperature record, more than 0.15°C hotter than the hottest decade so far using the NASA GISS dataset. Any takers? Andy [Revkin]?”

- Joseph Romm, Climate Progress, September 22, 2009

In a fit of rage, uber-alarmist Joe Romm of ClimateProgress has recently offered a temperature warming bet that he can win even if more than 85% of all climate models are shown to overpredict future warming.
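
One simple way to see the arithmetic of the bet is to compare Romm’s threshold with a model-projected warming rate. A hedged sketch, in which the roughly 0.2°C-per-decade ensemble-mean trend is an assumed round number (only the 0.15°C threshold comes from Romm):

```python
# Sketch of the bet arithmetic. The 0.2 C/decade model-mean trend is an assumed
# round number for early-21st-century projections; Romm's 0.15 C threshold is
# from his post.
romm_threshold = 0.15    # C: the 2010s must beat the warmest decade by more than this
model_mean_trend = 0.20  # C/decade: assumed typical model-projected trend

# Even if the world warms at only a bit more than 75% of the model-mean rate
# (i.e., the models overpredict), the decadal difference still clears Romm's
# threshold and he wins the bet.
for fraction in (1.00, 0.90, 0.80, 0.76):
    observed = fraction * model_mean_trend
    outcome = "Romm wins" if observed > romm_threshold else "Romm loses"
    print(f"warming at {fraction:.0%} of model mean -> {observed:.3f} C/decade: {outcome}")
```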

Has Joe seen the light and become a “lukewarmer”—that is, someone who thinks that human CO2 emissions will result in only a modest rise in global temperature, somewhere at or below the bottom end of the IPCC range of projections? Might he even be a closet ‘skeptic’–not a skeptic of climate change or anthropogenic climate change, but a doubter of climate alarmism?

For someone so strident on this issue, I would have thought that Joe Romm would bet on climate catastrophe, not climate-model catastrophe.

Romm issued his bet after hyperventilating about Andy Revkin’s recent article in the New York Times, which suggested that the lack of change in the world’s average surface temperature since the turn of the century (or in some instances, a few years prior) has the potential to make it difficult to get CO2 emissions regulated in the name of “global warming.”

For what it is worth, I agree with Revkin on this (tune in next week to see why), as do an increasing number of science writers who are hedging their bets in line with recent data and what new peer-reviewed articles are suggesting. And Romm is furious that Revkin has ‘mainstreamed’ lukewarming in the pages of the newspaper of record, the New York Times.

States Romm: “I will be happy to bet anyone that the 2010s will be the hottest decade in the temperature record, more than 0.15°C hotter than the hottest decade so far using the NASA GISS dataset.”

Well, many climate realists the world over would feel vindicated if the average temperature of the 2010s were only 0.15°C hotter than the decade of the 2000s (the current warmest decade). For that would provide still stronger evidence that the earth’s climate was responding to anthropogenic greenhouse gas emissions in a far more benign manner than the ensemble projection of climate models.

In fact, the rejoicing wouldn’t be limited to climate realists but would extend to just about anyone concerned about the potential for large negative impacts from climate change, given a slower-than-model-predicted evolution of global temperatures. I imagine, though, that Joe Romm would be an exception, because while he would have won the bet, he would have lost his Hell-or-High-Water war. He is emotionally attached to the issue, with a public record of alarmism that is beginning to put Paul Ehrlich in the shade. (John Holdren, with his billion-death climate scare still on the table, might be another story.) [Read more →]

September 24, 2009   14 Comments

The Federal 'Green' Superhighway: 3,000 Miles to Nowhere? (Part II: Obama's power grab and high cost)

[Yesterday's post discussed how FERC failed to implement the siting authority granted in the Energy Policy Act of 2005 and examined a case study about why it failed. Part II looks at Obama’s “green power” superhighway, the recent work by regional transmission planning organizations to bring renewable energy to market, and the extremely high costs to do so.]

Public policy has long supported the construction of new transmission lines that relieve congestion and reduce the cost of energy to consumers. However, it is another question entirely to construct a new “green” coast-to-coast transmission corridor given the mess our transmission system is in today and the corridor’s prohibitive cost. Critics have complained that it throws good (transmission) money after bad (renewable) generation money.

Slowly, regional system operators are resolving transmission bottlenecks and improving the smooth flow of energy in their service territories. The good news is that virtually all of the most important regional projects are likely to be in service well before our Washington representatives complete their transmission siting authority “power grab” (not that it will change their game plan). Also, regional transmission planning organizations are actively promoting and siting transmission lines. The regional system is working, and it doesn’t need FERC or Congress to fix it.

The local siting processes are working (regardless of how you feel about siting renewables-only transmission lines), but the cost of constructing this transmission is extremely high per unit of energy generated, given the intermittency of output from wind and solar power plants. The cost of new transmission for renewable projects can easily equal a quarter of the cost of building the power plant alone. In ERCOT, the price is over $2 million per mile to bring renewable energy into the existing grid, and transmission alone will add at least 5 cents/kWh to the cost of renewable electricity—more than double the cost of electricity from our existing fleet of nuclear power plants and 60% more than the cost of coal-fired electricity at the busbar. The Western Interconnect planning process is currently identifying likely renewable sites and looking at transmission line corridors.

FERC: Try, Try Again

In Part I we discussed how federal siting authority over new transmission lines was granted under the Energy Policy Act of 2005 (EPAct), yet FERC’s implementation of that authority failed judicial scrutiny. In addition, the case study presented concerning adding an interconnection between Southern California and Arizona clearly shows that there are many other issues that must be considered when establishing the need for FERC to intercede on behalf of one state or another. To my mind, the most significant issue, and the Arizona Corporation Commission agrees, is that a state must fully exercise its ability to construct local power generation facilities before attempting to cross-connect to an adjacent state. Merely needing the power is no reason for the federal government to exercise its eminent domain powers when there is an unwillingness to construct new plants.

Today we hear the next stanza of this same tired tune. We continue to be told that a complete overhaul of the U.S. power delivery system is required, but now the grid updates must also accommodate the higher levels of renewable energy expected to be generated over the next decade. Senator Harry Reid (D-NV) gave us a look at our future when, at a February conference hosted by the Center for American Progress Action Fund (a group organized by John Podesta), he proclaimed, “My legislation [another round of legislation he promised to introduce that would speed approvals of transmission lines] will require the president to designate renewable energy zones with significant clean energy-generating potential.” Reid went on to explain that the federal government should be given the authority, through FERC, to overrule state and local governments that slow the development of Obama’s promised 3,000 miles of new interstate transmission lines.

The proposed legislation would also provide FERC the power of eminent domain should states be unwilling to yield to the inevitable pressure from Washington to approve the plans. “We cannot let 231 state regulators hold up progress,” Reid said. “They should be given every opportunity to see if we can work this out through the state regulators. If that can’t be done I think there are very few alternatives for the American people,” other than eminent domain. But any delays or obstacles would be quickly settled, Reid said. “Whatever we pass at the federal level trumps all that,” he said.

John Podesta, president of the Center for American Progress, said a stronger federal siting authority is needed, given that the 4th U.S. Circuit Court of Appeals ruled that FERC’s interpretation of its backstop siting authority under the 2005 energy bill was too expansive.

“It’s time to get back to the table and find a way so that states and regions can plan for the transmission that they need but that the federal government has a role to play to make sure that gets done,” Podesta said.

Reid has yet to provide any details of his proposed bill, but a legislative aide said the bill would contain four main components: an interregional planning component, federal siting authority, a national cost allocation plan, and a requirement that any generation that connects to the grid meet “green” standards. The four parts appear very similar to a plan produced by the Energy Future Coalition and the Center for American Progress.

Thankfully, Reid’s proposed legislation has yet to see the light of day, given the extraordinary costs involved in constructing new national interstate transmission lines. For example, grid operators in the eastern half of the U.S. released a study in August estimating that more than $80 billion in new transmission infrastructure would be needed to get 20 percent of the region’s electricity from wind generation by 2024.

Does Siting Need Fixing?

The Federal Energy Regulatory Commission (FERC) recognizes the challenges posed by bringing electrons from new and disparately located renewable energy sources to population centers. In late May, FERC announced a series of transmission planning meetings that will focus on “wider integration of regional energy resources into the nation’s power grid.” In essence, renewable energy generation, principally wind energy, is located where the transmission infrastructure does not exist, and other distributed energy resources are located in transmission-constrained regions.

According to FERC Chairman Jon Wellinghoff, “Planning is one of the three legs on the transmission policy stool—the others are siting and cost allocation—and all are crucial to meeting the goals of assimilating demand resources, renewable energy and distributed generation into the grid for the benefit of consumers.” Here we go again.

From Market Pull to Product Push

Historically, electric utilities dictated when, where, and how much new generation would be added. Their integrated resource plans (IRP) determined the timing of plant additions, the fuel sources, and the location of the new generation resources. Transmission planners followed the lead of utilities to route the necessary transmission capacity while also seeking to lessen area congestion, if necessary. Traditionally, new power generation resources—and, by extension, new transmission—responded to a market pull: predicted load demand. The role of the state and local governments was oversight, providing access to transmission, and setting rates.

In contrast, renewable mandates have upended the traditional approach to developing an IRP. Rather than anticipated customer demand driving generation and transmission decisions, government mandates are now in the driver’s seat. Twenty-nine states and the District of Columbia have a renewable portfolio standard that requires utilities in those states to supply some percentage of renewable electricity by a date certain.

For instance, the California Public Utility Commission requires that 33% of that state’s power originate from renewable energy sources by 2020. In order to achieve this extraordinary goal, all new power generation procured by the state’s utilities must come from renewable energy sources. In this new world, the “pull” of market demand has been supplanted by a government-mandated “technology push”: renewable developers, pushing new power into the system in response to state-mandated levels of renewable power, now compete for access to limited transmission infrastructure.

Another challenge to building new transmission capacity to move renewable energy long distances, also discussed by Wellinghoff, is identifying acceptable siting locations for renewable energy facilities. In spite of FERC’s interest in being part of that decision process, so much progress has been made at the local level that FERC is, in practice, irrelevant to siting transmission lines.

Transmission Planning Out West

One important initiative toward this goal in the Western Interconnection is the Western Governors’ Association’s (WGA) Western Renewable Energy Zones (WREZ) study. In the WREZ study—which covers 11 western states, two Canadian provinces, and areas of Mexico that are part of the Western Interconnection—as many as 50 zones with substantial renewable resources are being identified so that renewable projects can be expedited and transmission projects can be planned in advance.

The ultimate goal of the WGA is to “develop 30,000 MW of clean and diversified energy by 2015.” The approach used by the WGA is first to identify regions with high potential for generating renewable energy—solar, wind, geothermal, etc.—by involving all the relevant stakeholders. The results of these studies in turn drive transmission planning.

The most recent draft map from the Western Governors’ Association illustrates Qualified Resource Areas: those areas with a high density of developable renewable energy resources after screening for known technical and environmental limitations for which data are available. These data will be used to determine Western Renewable Energy Zones (WREZ) in the Western Interconnection.

The state with the largest installed wind power capacity, Texas, has already identified Competitive Renewable Energy Zones (CREZ) within the ERCOT Interconnection. In March, the Texas PUC assigned approximately $5 billion of transmission projects to be constructed in these CREZ; the lines will eventually transmit 18,456 MW of wind power over more than 2,300 miles of new transmission from wind-rich West Texas and the Panhandle to the state’s highly populated metropolitan areas. To put the magnitude of these numbers into perspective, the cost of transmission is over $2,000,000 per mile, or over $270/kW of installed capacity.
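
A quick back-of-envelope check of those unit costs, using only the figures just cited:

```python
# Back-of-envelope check of the CREZ unit costs (all inputs are from the post).
total_cost_usd = 5.0e9     # Texas PUC-assigned CREZ transmission projects
wind_capacity_mw = 18_456  # wind capacity to be delivered
line_miles = 2_300         # new transmission line miles

per_mile = total_cost_usd / line_miles
per_kw = total_cost_usd / (wind_capacity_mw * 1_000)

print(f"cost per mile: ${per_mile:,.0f}")  # ~$2.2 million per mile
print(f"cost per kW:   ${per_kw:,.0f}")    # ~$271 per kW of installed wind capacity
```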

The regulatory body expects that the new lines will be in service within four or five years. The Texas PUC took about three years to select the most productive wind zones in the state, designate them as CREZ, and devise a transmission plan to move power generated from those zones to various populated areas in the state. Many of these new transmission projects will begin construction later this year.

As an aside, T. Boone Pickens’ investment in his now-delayed plan to build 1,000 MW of wind power in the Texas Panhandle is in jeopardy. The ERCOT transmission plans do not extend the wires far enough into the Panhandle to reach Pickens’ projects. Pickens now has 687 wind turbines available, turbines that cost him a cool $2 billion, which he hopes to recycle on a number of smaller projects in the U.S. and Canada. That’s a lot of wind turbines.

The Cost of New Transmission Is Substantial

More insidious are unpredictable transmission costs. Power sellers, buyers, and investors adamantly want price certainty in the total delivered cost. However, congestion charges can make the delivered price vary, especially under locational marginal pricing.

Everyone wants to know the answer to the question: What is the added premium to deliver renewable energy? Many transmission networks have both fossil fuel and renewable generators sharing the same network. Certainly, intermittent renewable sources have higher system-integration costs. Load balancing is more involved as well.

A recent study from Lawrence Berkeley National Laboratory (LBNL) may provide an early answer to the cost question. The research report examines the expected costs for new transmission infrastructure that would be needed to support an accelerated program of renewable energy projects, particularly wind energy. The report, “The Cost of Transmission for Wind Energy: A Review of Transmission Planning Studies,” was released in February 2009. (A copy can be downloaded at http://eetd.lbl.gov/ea/ems/reports/lbnl-1471e.pdf.)

The authors’ objectives in preparing the report were threefold: to define the transmission costs for a rapidly growing wind power industry, to discuss different transmission planning approaches, and to examine the models used to estimate future wind deployment. Our interest in this article is the transmission cost estimates prepared by LBNL.

The cost estimates are based on a review of 40 transmission planning studies completed between 2001 and 2008 by various developers, independent system operators/regional transmission operators, state agencies, and individual utilities. There is a wide range in transmission costs, although the costs are generally less than $500/kW. The cost of the median study scenario was $300/kW, or about 15% to 23% of the typical installed cost of a wind turbine plant. These numbers are quite consistent with the $270/kW from ERCOT discussed above.
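
To see where the 15% to 23% range comes from, invert the percentages: they imply a typical installed wind plant cost of roughly $1,300/kW to $2,000/kW. A minimal sketch (the installed-cost range is inferred here, not stated in the post):

```python
# Sketch of how the 15%-23% range follows from the LBNL median transmission
# cost. The $1,300-$2,000/kW installed-cost range is inferred from the
# percentages, not stated in the post.
median_transmission_per_kw = 300.0

for installed_per_kw in (2_000.0, 1_300.0):
    share = median_transmission_per_kw / installed_per_kw
    print(f"installed cost ${installed_per_kw:,.0f}/kW -> transmission is {share:.0%} of plant cost")
# Prints 15% and 23%; note that $300/kW is also close to the ~$271/kW ERCOT
# figure computed earlier.
```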

The authors also concluded that variation in the methodologies used in these 40 transmission siting studies, as well as the characteristics of the specific grid, may affect transmission installation costs (see table). In particular, the original purpose of the transmission line under study (whether it was congestion- or deliverability-focused) affected the cost of adding wind energy to the mix.

Estimated Installed Cost of Wind Transmission Based on Three Higher-Level Studies of Wind Transmission. Source: LBNL

Study | Wind Capacity | Unit Cost of Transmission for Wind Power
10% Wind Energy by 2030: AEP 765 kV Overlay Study | 200-400 GW | $150-$300/kW
20% Wind Energy by 2030: Wind Deployment System | 290 GW | $207/kW
Annual Energy Outlook 2008 Projections for 2030: National Energy Modeling System | 40 GW | $450/kW ($316/kW for transmission plus $133/kW for “long-term” multipliers)

The study also reviewed three high-level wind-transmission-only studies, as shown in the table above. Their costs are generally consistent with the original 40-study sample (median $300/kW, generally below $500/kW); at the high end, transmission approaches 25% of the $2,000/kW cost of constructing a new wind project.

The study also concluded that the historic cost of transmission was in the range of $35/MWh to $79/MWh, with an average of $45/MWh. Using reasonable economic assumptions for the ERCOT transmission projects and a 33% capacity factor, the transmission lines add about $50/MWh to the price of power generated by the wind projects. For perspective, existing nuclear plants as an industry deliver power to the grid at less than $20/MWh, and coal plants are in the range of $30/MWh.
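
For readers who want the mechanics, a per-MWh adder can be derived from a capital cost per kW with a standard levelization formula. The sketch below is generic: the fixed charge rates are assumed illustrative values, not the “reasonable economic assumptions” behind the $50/MWh figure above, and the result is highly sensitive to those assumptions (and to any O&M, losses, and congestion costs left out here):

```python
# Generic levelization sketch: converts a transmission capital cost ($/kW)
# into a $/MWh adder on delivered wind energy. The fixed charge rates (FCRs)
# are assumed illustrative values; the post's $50/MWh figure rests on its own
# (unstated) economic assumptions, which this sketch does not reproduce.
HOURS_PER_YEAR = 8_760

def transmission_adder(capex_per_kw: float, fcr: float, capacity_factor: float) -> float:
    """Annualized capital cost divided by annual delivered energy, in $/MWh."""
    annual_cost_per_kw = capex_per_kw * fcr                       # $/kW-yr
    annual_mwh_per_kw = HOURS_PER_YEAR * capacity_factor / 1_000  # MWh/kW-yr
    return annual_cost_per_kw / annual_mwh_per_kw

for fcr in (0.10, 0.15, 0.20):
    adder = transmission_adder(capex_per_kw=270.0, fcr=fcr, capacity_factor=0.33)
    print(f"FCR {fcr:.0%}: transmission adder ~${adder:,.0f}/MWh")
```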

Another Approach: Requiring Backup Power

Nevertheless, renewables do impose additional costs on the whole system. For instance, speedy ramp-up of backup power is essential when a wind farm goes down with as little as an hour’s warning. Reliability issues kick in as well.

For example, an ERCOT report concluded that only 8.7% of historic wind generation was produced during peak power hours, limiting wind’s effectiveness in trimming system peak demand.

Someplace in the delivery chain, this mismatch between intermittent energy production and load demand must be smoothed out. Utilities traditionally have taken on this burden themselves. Typically, a utility backfills wind/solar gaps with gas-fired plants to make up for any shortfall in energy production, based on a number of factors including the season, the weather, and the region’s operating experience. Using the same approach with very remote wind and solar farms isn’t as straightforward: without remediating, balancing measures, the entire long-distance energy delivery chain would, in effect, run intermittently.
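
A stylized sketch of that backfill practice, with invented hourly numbers: gas-fired units ramp each hour to cover whatever load the wind does not serve.

```python
# Stylized illustration of backfilling wind with gas (all numbers invented).
load_mw = [900, 950, 1_000, 1_050, 1_000]  # hypothetical hourly system load
wind_mw = [400,  50,   300,     0,   250]  # hypothetical hourly wind output

# Each hour, gas generation covers the load not served by wind.
gas_mw = [max(load - wind, 0) for load, wind in zip(load_mw, wind_mw)]
ramps = [later - earlier for earlier, later in zip(gas_mw, gas_mw[1:])]

print("gas backfill (MW):", gas_mw)           # [500, 900, 700, 1050, 750]
print("hour-to-hour gas ramps (MW):", ramps)  # swings of several hundred MW
# The wide ramps are the integration burden described above; moving the wind
# farm far away over a long line does not remove it.
```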

A more recent procurement practice is for the electric utility to insist that the renewable producer directly supply steady, baseload-style power. In particular, the utility expects the renewable power producer to have its own storage or natural gas backup. An example is Xcel Energy’s April 2009 request for proposals for 600 MW of solar thermal capacity that is “fortified” in this way.

Conclusion

Central planning based on temporary political majorities–or, dare one say, ‘political whim’–is not a viable long-term electricity policy. Free-market incentives to expand and build are preferable; do not expect a 3,000-mile ‘green’ superhighway as a result.

— Also contributing to this article were Sonal Patel, POWER senior writer, and Martin Piszczalski, Ph.D., an industry analyst with Sextant Research.

September 23, 2009   No Comments

The Federal 'Green' Superhighway: 3,000 Miles to Nowhere? (Part I: siting politics and state wealth transfers)

Investment in interstate transmission has not kept pace with the need for more electricity capacity, despite wakeup calls such as the widespread Northeast and Midwest blackout of August 2003. Transmission siting authority has become the mantra of those who claim that the “not in my backyard” (NIMBY) syndrome is driving U.S. energy policy. FERC was given the opportunity to flex its national siting authority muscle with passage of the Energy Policy Act of 2005 (EPAct), but its game plan failed to pass court scrutiny. Today, siting new transmission remains a states’ rights issue, as it has always been.

Transmission siting controversies are increasing given the growing number of renewable energy projects that want to interconnect with scarce transmission capacity. Now, another layer of complexity is in play due to the potential of a national renewable portfolio standard that portends hundreds if not thousands of new renewable projects that will all seek priority for grid access.

There are new renewable projects in development today that are already in the queue, waiting either for capacity on existing transmission lines or for the construction of new ones, given the prohibitive costs of transmission upgrades. Other projects are so remote that only a purpose-built transmission line can bring the energy to market.

Adding uncertainty to uncertainty, Congressional leaders have proposed constructing new transmission lines dedicated to moving only renewable energy coast to coast, whereby states’ rights would be of secondary importance. Regardless, ratepayers will end up paying the tens of billions of dollars for these new lines, further driving up the cost of electricity.

Below, we discuss how FERC failed to implement the siting authority granted in the Energy Policy Act of 2005 and provide a case study on the reasons for failure. Part II (tomorrow) will look at the latest rendition of the siting-authority power grab: Obama’s promise of a 3,000-mile coast-to-coast “green power” superhighway. We’ll also discuss the recent work by regional transmission planning organizations to bring renewable energy to market and the costs of doing so. It won’t come as a surprise that the costs are extremely high.

New Transmission Sites May Take Years

Siting new transmission lines is an exercise in patience and endurance. The industry has plenty of war stories about state, county, or local authorities being unable or unwilling to approve new transmission projects, especially projects that merely transit a state to get energy to an out-of-state market. One of the long-running and always contentious debates is the resistance of Connecticut residents to allowing power generated upstate to pass through their state to New York City.

One of the most egregious examples in my memory of how a project can become a career is AEP’s Wyoming-Jackson Ferry line, completed in 2006—16 years after the project’s launch, 14 of which were spent wrangling over siting.

Another example: a plan hatched in the late 1980s to move surplus power from coal-power-rich West Virginia to power-short New Jersey and New York crashed in the early 1990s due to the opposition of Pennsylvania. Delegates from the Keystone State asked, appropriately, “What’s in it for us?” The answer: “Not much.” Pennsylvania responded, “No thanks.” End of project, after nearly a decade of contention.

Congress has made numerous attempts to reduce these delays and shorten the time required to add new transmission capacity where it is most needed. And each time the new laws have failed miserably.

The most recent attempt came in the EPAct provisions that gave FERC the authority to override state and local opposition to the construction of interstate transmission lines if the agency determines that they will reduce system congestion. In April 2007, the Department of Energy designated two regions that qualify for such treatment as “national interest electric transmission corridors.” One covers a broad territory stretching from Maryland to New York and as far west as Ohio. The other includes a large chunk of southern California, southern Nevada (including Las Vegas), and parts of Arizona all the way to Phoenix.

In announcing FERC’s plan at the time, then-Energy Secretary Samuel Bodman said, “The parochial interests that shaped energy policy in the 20th century will no longer work.” Maybe so, but instead of serving the “national interest,” the proposed corridors look to me a lot like poaching routes. They would enable regions that have resisted building generation locally—in hopes of buying cheaper power from other regions—to avoid paying the full costs of “their” power. The problem with this plan is that it saddles the out-of-state generating regions with the environmental and lost-resources costs and consequences.

California Dreamin’

The Sunrise Powerlink is certainly the most ambitious project developed by San Diego Gas & Electric (SDG&E) in many years. The recently approved project will run a new 150-mile transmission line east from San Diego into the deserts. SDG&E claims the link will spur development of renewables (geothermal and solar), lower transmission system congestion costs, and “reduce subsidies paid to local, aging power plants that are more expensive to operate.”

The second and third justifications are closely related and often given short shrift by the media. In the report backing its national corridor designations, the DOE states that one of the biggest reasons it considers the southern California grid “troubled” is the high cost of running those old plants in California—typically gas-fired units in urban areas. No surprise: SDG&E hasn’t built a new power plant in its service territory in more than 30 years.

In the interest of full disclosure, I worked on the design and construction of SDG&E’s last major power plant project, Encina Unit 5, when it was constructed in 1976–78. I was also present when then-CEO Tom Page announced in 1978 that SDG&E was not going to construct any more plants but would become a “wires” company and in the future import energy from other sources. It was a pure business decision at the time, given the resistance of local governments and citizens to building any more power plants: a conscious choice to avoid building local generation and rely solely on imported energy to cover load growth. To SDG&E’s credit, the plan has worked for over 30 years. But now Arizona has the surplus power capacity and a growing population, and is unwilling to share its electricity resources. This is a game changer for Southern California.

Buddy, Can You Spare a Megawatt?

I recently drove my 4WD truck to the top of a small mountain just south of the Palo Verde Nuclear Generating Station in Arizona, about 100 miles from the California border. The power park view warms a power engineer’s heart: The three-unit 3,739-MW nuke lies to the north, and to the south sit Sempra’s 1,250-MW Mesquite Generating Station, Pinnacle West Energy’s 1,136-MW Redhawk Power Plant, and LS Power’s 713-MW Arlington Valley Energy Facility. Travel over the next hill and south a few miles and you’ll find the 2,400-MW Panda Gila River Project. All but Palo Verde are gas-fired combined-cycle plants.

I’d venture that a big chunk of the more than 9,200 MW is just aching to find a path into Southern California. Why? The average retail electricity price is roughly 75% higher there than in Arizona: 15 cents/kWh vs. 8.5 cents/kWh. Finding new customers willing to pay more for the same commodity would be a business coup for the plants’ owners. The only hurdle would be permitting the new lines needed to deliver it. But the Arizona Corporation Commission (ACC) has a different vision of how that power will be used—generate it in Arizona, use it in Arizona.

To its credit, the ACC saw through the “master plan” scheme and unanimously voted down a Southern California Edison (SCE) proposal to build a 231-mile transmission line to connect the power park to a substation near Palm Springs (and ultimately to the Sunrise Powerlink and points west). SCE execs argued that the link would “increase the state’s ability to transmit energy.” What they really meant was the ability to transmit energy to Southern California. Bill Mundell, an Arizona energy commissioner, explained the rejection succinctly at the time: “I don’t want Arizona to be the energy farm for California.” Commissioner Kris Mayes added, “You [SCE] are trying to drop a giant extension cord into Arizona.”

The final meeting on the project turned a bit testy when commissioners quizzed Dian Grueneich of the California Public Utilities Commission about her state’s recent lack of progress building new power plants and transmission lines. Mundell asked, “Why should Arizona put its natural resources, environment, and future energy supply on the line while California does relatively little?”

That’s indeed what California is asking Arizona to do, in an updated version of Aesop’s fable of the grasshopper and the ant. And that’s precisely why the entire national interest corridors program will face challenges from power “donor states.” In California’s case, taxing that nasty coal-fired power coming into the state up north and strong-arming its neighbor to the east for a bigger slice of existing gas-fired capacity is an energy plan doomed to failure. Nevada, keep an eye on these guys.

New Developments

Unable to convince Arizona’s utility commissioners to approve the new power line, the California utilities did just as was expected: they asked FERC to step in and exercise some of that authority vested in it by EPAct to make Arizona plug in the extension cord. FERC tried to mediate the disagreement without success and subsequently issued an order forcing Arizona to agree to the interconnection. The inevitable federal court case resulted.

In February 2009, a federal appeals court slapped FERC’s hand for overreaching the authority granted to the agency by EPAct when it took an “expansive interpretation” of the law in asserting its power to override state decisions.

The U.S. Fourth Circuit Court of Appeals in Richmond, Va., issued its decision in a case brought against the regulatory commission by the Piedmont Environmental Council and multiple states and parties—including the New York Public Service Commission (PSC) and the Minnesota Public Utilities Commission (PUC).

At the heart of the matter was the authority granted by EPAct, which allowed the commission to approve interstate power lines after the affected state had “withheld approval for more than a year.” But in issuing its final rule in November 2006, FERC interpreted the phrase “withheld approval for more than one year” to include a state’s outright denial of a permit within the one-year statutory timeframe.

The petitioners had filed requests for rehearing on FERC’s final rule, arguing that the agency had erred in its interpretation. The parties also asked the court to review several rulemaking decisions FERC had made with the application of that interpretation.

“FERC’s interpretation is contrary to the plain meaning of the statute,” wrote Judge Blane Michael for the majority. “Simply put, the statute does not give FERC permitting authority when a state has affirmatively denied a permit application within a one-year deadline.”

Michael said that FERC’s standing interpretation would mean that state commissions would lose jurisdiction unless they approved every permit application in a national interest corridor. “Under such a reading it would be futile for a state commission to deny a permit based on traditional considerations like cost and benefit, land use and environmental impacts, and health and safety. It would be futile, in other words, for a commission to do its normal work,” he wrote.

The court’s decision now sets hurdles for FERC-approved projects whose public commissions have issued denials, but it hasn’t slowed the pressure to overhaul (again) the provision of EPAct that failed to pass judicial scrutiny.

In essence, FERC powers granted under EPAct were neutralized by the appeals court’s decision and Arizona’s rejection of the construction of a new transmission line stands.

But, this court decision doesn’t bring an end to the push for federalizing transmission siting authority. Far from it. The next page of the game plan uses the supposed need for a “green” coast-to-coast transmission superhighway as cover for nationalizing all future transmission siting decisions.

— Also contributing to this article was Sonal Patel, POWER senior writer

September 22, 2009   5 Comments

Running Into Oil

“Some commentators hope that new technology will lead to important deepwater finds.  Some new deepwater areas with giant potential, such as the Perdido Trend in the western Gulf of Mexico, will no doubt be found, but generally, the geology of most deepwater tracts is not very promising.” 

- Colin Campbell (founder: Association for the Study of Peak Oil),  Noroil, December 1989. 

The past week was a bad one for peak oil enthusiasts, as three separate announcements indicated the abundance of undiscovered petroleum.

First, BP announced that it has found a field in the Lower Tertiary basin in the deepwater Gulf of Mexico, named Tiber, containing something on the order of 3 billion barrels.

Next, Petrobras announced another discovery in the pre-salt basin, this one Guara, containing about 1 billion barrels of recoverable oil.

And in the Bakken Shale, a new zone was proven to be productive and possibly capable of producing another billion or so barrels.

While some (like Matt Simmons and Jeremy Leggett) have pointed to these developments as evidence of the ‘need’ (sic) to go to extremes to find more resources, in fact they highlight the manner in which better technology and knowledge are making previously uneconomic resources viable.

Others have argued that a discovery size of ‘only’ (sic) three billion barrels suggests resource scarcity, since that amount represents only a few weeks of consumption. That is a typical context-free remark: few discoveries amount to more than a small portion of the resource. The super-giant Prudhoe Bay field, for instance, represents only about six months of global oil consumption. During the 1980s and 1990s, large amounts of production came online around the world in areas like Yemen, Oman, and Colombia, where the discoveries were smaller than what is now being found.
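
The context is simple division. A sketch assuming world consumption of roughly 85 million barrels per day and about 15 billion barrels recoverable for Prudhoe Bay (both round 2009-era assumptions, not figures from the post):

```python
# Putting discovery sizes in context of world consumption. The ~85 million
# bbl/day consumption rate and the Prudhoe Bay recoverable estimate are
# assumed round numbers; the Tiber and Guara sizes are from the post.
WORLD_BBL_PER_DAY = 85e6

fields = [("Tiber (Gulf of Mexico)", 3e9), ("Guara (Brazil)", 1e9), ("Prudhoe Bay", 15e9)]
for name, barrels in fields:
    days = barrels / WORLD_BBL_PER_DAY
    print(f"{name}: {barrels / 1e9:.0f} billion bbl ~ {days:.0f} days of world use")
# Tiber's ~35 days sounds small, but as the text notes, single fields are always
# small next to cumulative consumption; the play, not the field, is what matters.
```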

More important, each of these recent developments shows progress in a new geological area, and progress can be expected to advance sharply as activity increases and knowledge improves.  This is only the beginning of a long process of exploitation which will see large-scale resources developed; any given field typically represents only ten percent or so of the play’s resources.

But it also discredits the oft-repeated argument that there are no new plays left. Years ago, when new technology made it possible to perform seismic studies of the large subsalt area of the Gulf of Mexico, I noted to my mentor M. A. Adelman that this was being described as the ‘last new play’ to be exploited. I asked him how many times he’d heard that before. He said, “all my career” (which began in the 1950s). Since that time, new plays have included the pre-salt in Brazil and the Lower Tertiary in the Gulf of Mexico.

The assertion by resource pessimists that there are no new areas is based on the false assumption that geological knowledge of the Earth is so extensive that every area must already be identified. In fact, while most areas are identified, at least in a general way, many remain untested for the simple reason that they are difficult to access for political, legal, or geographical reasons.

The recent discoveries in east Africa are a clear demonstration of this:  the government of Uganda had simply been too preoccupied with its political troubles to seek exploration until recently.

The reality remains that the conventional petroleum resource appears so extensive as to satisfy the world’s needs for decades to come, with the primary obstacle being the imagination of the pessimists. And with enough statism and dulled minds in place of vibrant free-market entrepreneurship, the pessimists will be ‘right.’ As David Osterfeld wrote: “Perhaps the kernel of truth in the catastrophist position is that a completely closed or controlled society would, in fact, face the ominous prospect of resource depletion.” (1) Well said.

(1) David Osterfeld, Prosperity Versus Planning: How Government Stifles Economic Growth (New York: Oxford University Press, 1992), p. 102.

September 21, 2009   9 Comments