The Internet has been burning up these last two days with reactions to a new academic working paper (Do Energy Efficiency Investments Deliver? Evidence from the Weatherization Assistance Program) by researchers at the Energy Policy Institute at the University of Chicago (EPIC) and the University of California, Berkeley, associated with the E2e Project.
Let me be blunt and to the point. The “results” of this very narrowly focused and arguably conceptually flawed study are being blown out of proportion, with many news article headlines taking this one example as representative of all residential energy efficiency programs. Unfortunately, this flawed conclusion has been promoted by the Energy Policy Institute itself in its press release and accompanying policy brief.
For those not yet familiar with this story, the authors conducted a study of one particular low-income program (the federal Weatherization Assistance Program, or WAP), as implemented in portions of one state (Michigan), and somehow ended up with the sweeping headline “Study Finds Costs of Residential Energy Efficiency Investments are Double the Benefits.”
Some of the popular press is already picking up on this theme, and the concern is that a misunderstanding (or misuse) of this study will lead to low-income families having less access to important programs that drive down their utility bills. Worse yet, it could be used as a broad-brush attack on all types of energy efficiency programs.
Evaluation wonks will be able to point to several minor to moderate problems with the study’s assumptions and calculations. But in the interest of time, let me focus on two fundamental flaws in the study and how the results are being “spun.”
Knocking down a straw man
First, the authors create a “straw man” implication that somehow the WAP is expected to be cost-effective by simply comparing the direct energy savings to the total project costs to serve the home. In reality, no knowledgeable expert in this field expects the WAP (or any typical low-income program operated by utility companies) to be cost-effective solely on the basis of direct energy savings. States commonly exempt low-income programs from the usual cost-effectiveness tests. This is in part due to the very poor condition of the housing stock, and the major costs involved in upgrading the housing. It is also in recognition of the special needs of that target population, and that they have no discretionary income to devote to energy improvements, leaving the program to pay the entire costs.
As a result, these programs are typically judged by including the associated “non-energy benefits” in addition to the direct energy savings. These multiple benefits include things like the effects on comfort, health, safety (e.g., WAP typically installs smoke detectors, CO detectors, fixes wiring problems, fixes gas leaks, etc.), increased value of the improved housing stock, reduced utility-bill payment arrearages and non-payment collection costs (which saves money for all ratepayers), improved ability to remain in the dwelling and not have to be relocated, etc.
Studies exist in the industry that quantify these types of variables, and when taken in aggregate, the non-energy benefits’ value can nearly equal, or even exceed, the direct energy savings value. DOE’s last meta-evaluation of the Weatherization Assistance Program found direct energy savings averaging $3,917 and non-energy benefits valued at $3,466 (nearly as much as the direct energy savings). Viewed in that comprehensive manner, programs like WAP are cost effective from that broader societal perspective—as a public policy, they make sense.
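Using the DOE meta-evaluation figures just quoted, a quick back-of-the-envelope sketch (illustrative only; the variable names are mine) shows how much the benefit picture changes once non-energy benefits are counted:

```python
# Per-home benefit figures from DOE's WAP meta-evaluation (cited above)
direct_energy_savings = 3917  # dollars, direct energy savings
non_energy_benefits = 3466    # dollars, monetized non-energy benefits

total_benefits = direct_energy_savings + non_energy_benefits
multiplier = total_benefits / direct_energy_savings

print(f"Total benefits: ${total_benefits:,}")          # $7,383
print(f"vs. direct savings alone: {multiplier:.2f}x")  # 1.88x
```

Any cost-effectiveness test that ignores the second line item is, by construction, missing nearly half the benefits.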
The study released Tuesday simply ignores those other multiple benefits, and does not quantify them in the analysis. This methodological flaw pre-ordains the conclusion.
The straw man problem is compounded when the report suggests that WAP fails as a “means to fight climate change.” While the program does produce some CO2 reduction benefits, these are just a bit of frosting on the benefits cake. No one would suggest WAP should be considered as entirely, or even primarily, a mechanism to fight climate change. Yet the study disingenuously reports that the cost of WAP as a carbon reduction strategy is $329 per ton, by loading all the costs of the program onto the CO2 benefit as though there were no other benefits, and that the program was only being done to reduce CO2—a ridiculous premise.
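The accounting problem is easy to see with a toy example. Every number below except the study’s $329/ton is a hypothetical placeholder of my own; the point is only that dividing all program costs by tons of CO2, while crediting none of the other benefits, mechanically inflates the cost per ton:

```python
# Only the $329/ton figure comes from the study; everything else here
# is a hypothetical illustration of the cost-allocation issue.
study_cost_per_ton = 329.0   # study: ALL program costs / tons of CO2 avoided

total_cost = 5000.0          # hypothetical total program cost per home
other_benefits = 4000.0      # hypothetical energy + non-energy benefits per home

# Tons of CO2 per home implied by the study's own method
tons_co2 = total_cost / study_cost_per_ton

# Credit the other benefits first, then divide only the remaining net cost
net_cost_per_ton = (total_cost - other_benefits) / tons_co2
print(f"Net cost per ton after crediting other benefits: ${net_cost_per_ton:.0f}")
```

Under these (made-up) numbers, the same program that the study scores at $329/ton scores at a small fraction of that once the other benefits are credited first, because almost all of the cost is already paid for by benefits that have nothing to do with carbon.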
The second (and unforgivable) fundamental flaw is that the study generalizes from an extremely limited sample. From one sample in one state, EPIC makes the leap to claiming that seemingly all residential energy efficiency investments have costs that are “double the benefits.”
In contrast, several national studies have examined the costs of residential energy efficiency programs across dozens of states and have found them to be highly cost effective, though low-income programs do have higher costs on a per-kWh basis. For example, in a recent study, Lawrence Berkeley National Laboratory (LBNL) finds that the cost of low-income programs averages 14 cents per kWh, compared to 3.3 cents per kWh for all residential programs. ACEEE’s most recent review of energy efficiency program costs similarly found that the average cost per saved kWh from residential and low-income programs combined across nine states was 3.7 cents/kWh. This is less than half the cost of electricity from a new power plant, and obviously very cost effective. And since those figures include no monetized value for CO2 reductions, the CO2 reductions come as an essentially “free” extra benefit.
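The LBNL and ACEEE figures above can be put side by side in a short sketch (variable names are mine) to show both why low-income programs are a special case and what “less than half the cost of a new power plant” implies:

```python
# Per-kWh saved-energy costs cited above (cents per kWh)
lbnl_low_income = 14.0       # LBNL: low-income programs
lbnl_all_residential = 3.3   # LBNL: all residential programs
aceee_nine_states = 3.7      # ACEEE: residential + low-income, nine states

# Low-income programs cost roughly four times the residential average,
# which is exactly why they are judged with non-energy benefits included.
ratio = lbnl_low_income / lbnl_all_residential
print(f"Low-income vs. all-residential: {ratio:.1f}x")  # 4.2x

# "Less than half the cost of a new power plant" implies the plant's
# electricity costs more than twice the saved-energy cost.
implied_plant_floor = 2 * aceee_nine_states
print(f"Implied new-plant cost: > {implied_plant_floor:.1f} cents/kWh")
```

In other words, generalizing from the most expensive program category to the whole residential sector multiplies the apparent cost several times over.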
In short, this study cherry-picked the worst possible program for comparing total costs to just direct energy savings, then set up a straw man to knock down, then tried to suggest, from an extremely limited sample of one program type, that all “residential energy efficiency investments” are suspect.
I should note one additional factor that really exacerbates the “cherry-picked” issue, and which I’ve not seen mentioned in any other discussion. This study looked at the WAP during the “stimulus package” years. As someone who was directly involved with one of the biggest “Better Buildings” pilots in the nation during that time period, I can tell you that the WAP was extremely stressed at the time, with tremendous pressure to push money out to the field. Job creation was at least as big a goal as energy savings, and they were functioning with a lot of new and inexperienced employees in order to handle the huge increase in funding and the deadlines to get it spent. I was there, on the ground, in Michigan at the time, and I know for a fact that the stimulus package demands on the system drove up the average cost per home in the program, in an effort to “get the money spent.” This would naturally tend to diminish the apparent cost effectiveness in terms of energy savings per dollar spent. This “stimulus package” distortion of the WAP during the years of the study further discredits any leap to generalize results to the “normal” program, much less to other examples of “residential energy efficiency investments.”
The real motive behind the study
Given the affiliation of the authors with E2e, one suspects that an important motive at play here is to make the case for the neo-classical economists’ preferred climate policy: so-called “market-based” approaches such as a carbon tax. Indeed, they explicitly argue for that as an alternate policy approach in both their press release and the accompanying policy brief. ACEEE supports the concept of a carbon tax. But it should be seen as a complement to, not a replacement for, traditional energy efficiency programs.
Viewed in the proper context, this new report could be seen as an interesting study to add to the large volume of energy efficiency program evaluations conducted over the years. The study does identify some problems with this particular program, but they can be easily addressed. For example, the energy audits conducted do not appear to have been calibrated with actual energy usage for each home, and thus baseline energy use and the amount of energy saved were overestimated. A study in New York found that such calibration, among other steps, improved the accuracy of energy savings estimates from 60% to 90%.
However, if taken out of context and generalized way beyond any justification, this new E2e working paper could be misused to attack critically important energy efficiency policies and programs. The data on the cost-effectiveness of residential energy efficiency programs are robust and extensively documented. Hopefully, well-informed policymakers and reporters will prevent any misuse of the study.
It should be noted that one of the co-authors of the study, in an interview with the Washington Post, properly acknowledged that the study results should not be generalized. Meredith Fowlie, an associate professor of economics at the University of California, Berkeley, told the Post: “This is one study in one state looking at one subpopulation and one type of measure. I would not feel comfortable generalizing from our study in Michigan.”