
California’s Wrong Debate (Models cannot mask LCFS’s failure in the making)

By Tom Tanton -- July 2, 2013

“We should not be using models to ‘validate’ policy and regulations. We should be using the models to better inform policy debates and avoid picking technological winners and (more frequently) losers.”

California’s Global Warming Solutions Act of 2006 (AB 32) put the state on a track rejected by the nation as a whole: a regulatory limit on carbon dioxide (CO2) emissions. This policy, which I have criticized as elitist climate policy postmodernism [1], is an all pain, no gain policy with high implementation costs.

The result of AB 32, California’s Low Carbon Fuel Standard (LCFS), has been debated for six-plus years, including the release of rival studies estimating regulatory impacts. Studies do not debate the climate-change impacts because the answer is … nil.

LCFS requires fuel producers to lower the average carbon content of their products 10 percent by 2020. It is a huge economic variable for the state’s (troubled) economy, and the size of California makes it a national economic issue as well. 

Dueling Studies

A year ago, an oil-industry-backed analysis by Boston Consulting Group estimated that California could lose between 28,000 and 51,000 jobs. The losses included many high-paying skilled manufacturing jobs, as well as indirect job losses due to multiplier effects.

Just last month, a counter study of LCFS was released by ICF that paints a much rosier picture than that of Boston Consulting. The face-value result might not be troubling, but the peculiar assumptions should be. 

Make no mistake: ICF’s study is an interest-group effort underwritten by a group of alternative fuel producers and backers, including the California Electric Transportation Coalition, the California Natural Gas Vehicle Coalition, the National Biodiesel Board, the Advanced Biofuels Association, Environmental Entrepreneurs and sustainable investor group Ceres. As such, it should have no particular standing relative to the Boston Consulting study of a year before, which was funded by oil-related interests.

ICF’s study purports to confirm what advocates (and beneficiaries) of the LCFS have been saying since the rule was first proposed by then Governor Schwarzenegger in 2006: that the targets can be met with increases in the share of vehicles powered by biofuels, natural gas, electricity and hydrogen, with little economic harm–or even benefit.

“The outcomes are sort of intuitive, but without actual analytics, we can sort of be having this conversation that’s more subjective,” said Eileen Tutt, executive director of CalETC, which paid for the bulk of the $147,000 effort. “Intuitively, I think we all know that being totally dependent on one fuel is bad for us from an economic perspective, but we believe it’s important to show that analytically.”

After my forty years in energy policy, I still find that sort of statement laughable. It is not intuitive that relying more on the most expensive fuels can somehow lower prices, nor are the ‘analytics’ ever complete.

While some level of diversification has value as a hedge against rising fuel prices, it is important to keep two facts in mind. First, too much hedging (insurance) is not economic. Second, hedging by definition also forfeits the opportunity to take advantage of price decreases. Would we have been better off hedging against rising natural gas prices just a few years ago?

A peer review of the petroleum industry report came out last month, finding that its assumptions of consumer behavior and refiners’ responses were “overly pessimistic.” (Peer review of the ICF study is planned but not yet available–stay tuned). The critics of the Boston Consulting study stated:

“[T]he economy has more than ample capacity to absorb even the most pessimistic projections of indirectly displaced workers,” the review said. “… [I]t is neither reasonable nor responsible to suggest that the overwhelming majority of these workers will be unemployed for a significant time.”

Again, more than laughable: the California economy cannot seem to absorb even its current 8.6 percent unemployment (fifth highest nationally) and its long-term unemployed, to say nothing of the nearly 25 percent unemployment among the state’s youth. Nevertheless, I am not here to argue for one modeling effort over the other, but for both, put to a different use.

Model Humility Needed

The duel between armies of modelers just illustrates the misuse of economic (or environmental, or global climate) modeling for policymaking. California’s regulators have long used complex computer-based models to examine proposed policies. During the 1980s, they were integral components of Integrated Resource Planning (IRP) for electric utilities under the Public Utilities and Energy commissions.

The hope was that models which captured the complexity of power supply and diversity of conservation choices could shed light on how to provide reliable electric services at the lowest possible cost. Often guided by such models, IRP unfortunately produced choices that made electricity much more costly than necessary and ultimately gave rise to the state’s costly experiment in electric industry restructuring, often misnamed ‘deregulation.’

To achieve the ‘reasonable-cost’ goals of the LCFS (or, more broadly, AB 32), elected officials and other policy makers must understand that a reflexive trust in similar models could have the same consequences for carbon-related policy. That trust also calls the goals themselves into question. Compared with electricity restructuring, a failure of fuels policy would be even more catastrophic.

Models are simplified abstractions of energy production and consumption activities, regulatory activities, and producer and consumer behavior. The projections are highly dependent on the data, analytical methodologies, model structures, and specific assumptions used in their development. Trends depicted in the analysis are indicative of tendencies in the real world rather than representations of specific real-world outcomes.

Even where trends are stable and well understood, the projections are subject to uncertainty. Many events that shape energy markets are random and cannot be anticipated, and assumptions concerning future technology characteristics, demographics, and resource availability are necessarily uncertain. Models are often run under different “scenarios” that represent different policies (e.g. cap and trade vs. a fuel tax) or background assumptions (e.g. how quickly sequestration technology might develop).
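
To make that sensitivity concrete, here is a minimal toy sketch in Python. Every number in it, from the starting price to the growth rates, is a hypothetical placeholder rather than a figure from any study discussed here; the only point is how widely the outcomes spread once the scenario assumption changes.

```python
# Purely illustrative: run one toy projection under several assumption sets
# ("scenarios") to show how widely the outcomes can spread. None of the
# figures below come from the studies discussed in this post.

def project_fuel_cost(start_price, annual_growth, years):
    """Compound a starting fuel price forward under an assumed annual growth rate."""
    return start_price * (1.0 + annual_growth) ** years

# Hypothetical scenarios: name -> assumed annual fuel-price growth rate
SCENARIOS = {
    "low growth": 0.00,
    "mid growth": 0.03,
    "high growth": 0.07,
    "price decline": -0.02,  # the case a pure hedge forgoes
}

if __name__ == "__main__":
    start_price = 4.00  # hypothetical $/gallon-equivalent today
    horizon = 7         # e.g., 2013 through 2020
    for name, growth in SCENARIOS.items():
        price = project_fuel_cost(start_price, growth, horizon)
        print(f"{name:>13}: ${price:.2f} per gallon-equivalent after {horizon} years")
```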

But scenarios cannot cope with future events that go unimagined, or that modelers refuse to accept. Early in the debate over the LCFS, California Air Resources Board (CARB) officials were warned about the cost and the limited GHG-reduction potential of corn-based ethanol, yet it still took center stage in the regulations until overwhelming evidence became available. The models never saw that coming.

The current models are best treated as very hazy crystal balls, and no one should bet their own money (or the taxpayers’) on reality coming anywhere close to the forecast. It is almost certain that other plausible assumptions and unforeseen events could produce far more optimistic or pessimistic results.

Models are worth using for a very different reason. They force us to organize our thoughts, and their results are starting points for further analysis and contingency planning. Most importantly, they force us to reevaluate assumptions in both data inputs and structure that may have seemed unimportant before we saw the results.

For example, look at the impact of natural gas prices on the overall economy and on energy prices in particular. Much of the LCFS regulatory planning was done well before natural gas prices plummeted with the technological success of hydrofracturing and horizontal drilling, yet the regulations are hardly any different than they were in, say, 2009.

Model Assumptions

Looking inside the boxes, we find an assumption that might drive the differing conclusions. Specifically, the CalETC/ICF analysis assumes that low-cost technology to produce low-carbon auto fuels will become commercially viable and will continue to improve in price. WSPA/Boston Consulting makes no such assumption and instead sees that removing increasing levels of carbon will be more costly after the easy changes have been dealt with.
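
To see how much that single structural assumption matters, consider another minimal toy sketch in Python. The two cost functions, the learning rate, and the escalation rate are hypothetical stand-ins for the two studies’ contrasting assumptions, not numbers taken from either report.

```python
# Purely illustrative: the same ten-step reduction "costs" very different
# amounts depending on one structural assumption. The unit cost, learning
# rate, and escalation rate are hypothetical, not taken from either study.

def cost_with_learning(increments, base_cost=100.0, learning_rate=0.15):
    """Optimistic view: each successive increment is cheaper as technology improves."""
    return sum(base_cost * (1.0 - learning_rate) ** i for i in range(increments))

def cost_with_rising_marginal(increments, base_cost=100.0, escalation=0.25):
    """Pessimistic view: each successive increment costs more once the easy changes are gone."""
    return sum(base_cost * (1.0 + escalation) ** i for i in range(increments))

if __name__ == "__main__":
    steps = 10  # stand-in for ten one-percent reductions in carbon intensity
    print(f"Assuming continued learning-by-doing: ${cost_with_learning(steps):,.0f}")
    print(f"Assuming rising marginal cost:        ${cost_with_rising_marginal(steps):,.0f}")
```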

The point of modeling is not to get an answer. It is to see the questions that matter for further analysis, and to figure out important contingencies.

For example, the high prices and disruptions that plagued California’s power markets in 2001 in very large part resulted from bidding rules that were justified by some models’ forecasts of continuing surplus generation capacity. Had there been contingency arrangements to change those rules in the event of a supply shortage, as some had recommended, the dislocations would not have been anywhere near so severe.

These conflicting modeled outcomes suggest that we change how we use models. We should not be using models to ‘validate’ policy and regulations. We should be using the models to better inform policy debates and avoid picking technological winners and (more frequently) losers.

Should we focus on carbon intensity of the overall economy rather than the slice called ‘transportation fuels’? Aren’t we already better than most when it comes to carbon intensity and efficiency?

Hasn’t that performance come about with less, not more, regulation, relying (now decreasingly) on free-market principles?

Trusting Orwellian statements and believing counter-intuitive model results is not going to make the LCFS road-to-nowhere any less costly. A rethink is in order.

—————–

[1] Also see my previous posts on AB 32/CARB/LCFS:

Misdirected Innovation: Environmentalist Taylor on Cap-and-Trade (Part I)

California Cap-and-Trade: Making Ourselves Poorer and ‘Dirtier’ (Part 2)

California Climate Rethink? CARB’s AB 32 Implementation Plan Under Fire

Rent Seeking with Global Warming: From Enron to California AB 32

California’s Economy and Global Warming: Political Morphology

2 Comments


  1. Ed Reid  

    “WSPA/Boston Consulting makes no such assumption and instead sees that removing increasing levels of carbon will be more costly after the easy changes have been dealt with.”

    That approach would seem to comport with both logic and experience. Typically, the cost increases experienced in progressively higher levels of reduction are non-linear.

    I would prefer to spend funds doing RD&D on alternatives which have the potential to be economical, rather than funding commercialization of equipment which does not have that potential. I would also prefer doing so against the backdrop of an ultimate emissions reduction goal, rather than continually chasing incremental reductions.


  2. Ray  

    “The point of modeling is not to get an answer.”
    Way back when I was a college student taking a course in numerical analysis, the professor told us students that the purpose of computing was not to obtain numbers but rather insight. The numbers were necessary but they weren’t all that you wanted.

