The Siren Call of the Negawatt: Justifying Ratepayer Funded Energy Efficiency Schemes

The energy industry has found a new religion – that of the Negawatt. Over the past decade it has grown from a small following into the new Messiah of capacity planning. It's one of the few causes that brings utilities and regulators to worship at the same shrine. In fact they like it so much that they're happy for an increasing number of consumers to pay for it. The only problem is that, like other faith-based beliefs, no-one really knows whether it exists or whether it delivers what it promises.

If the negawatt is new to you, it's a very neat scheme. The theory goes that if you can persuade consumers to use less energy, then you need to build fewer new power stations. Each kWh of energy saved is a negawatt (suggesting the name was coined by a marketing person rather than someone who understood the difference between power and energy). Hence each negawatt achieved means less generating capacity is required to support demand. As power stations are expensive to build and operate, lots of negawatts are an attractive prospect: they represent an effective reduction in the need for new power stations, or the opportunity to put off building them. In other words, negawatts mean utilities save money, which should be reflected in lower energy bills for consumers.
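A quick back-of-envelope calculation shows why that distinction between power and energy matters. The figures in the Python sketch below are illustrative assumptions rather than numbers from any real programme; the point is simply that annual energy savings (kWh) and avoided generating capacity (kW – negawatts proper) are not the same thing.

```python
# Back-of-envelope sketch of the energy-vs-power distinction behind the "negawatt".
# All figures below are illustrative assumptions, not data from the article.

HOURS_PER_YEAR = 8760

def average_avoided_power_mw(households: int, kwh_saved_per_household: float) -> float:
    """Average generating power avoided if the savings were spread evenly over the year."""
    total_kwh = households * kwh_saved_per_household
    return total_kwh / HOURS_PER_YEAR / 1000  # kW -> MW

# Assume one million households each shave 500 kWh a year off their usage.
avg_mw = average_avoided_power_mw(1_000_000, 500)
print(f"Average avoided load: {avg_mw:.0f} MW")  # ~57 MW

# Capacity planning, however, cares about peak demand. If none of those savings
# fall during the evening peak, the avoided *capacity* could be close to zero;
# if they all do, it could be several times the average figure.
```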

As it appears that this is so obviously a win-win concept, regulators have increasingly been willing to support what are called Ratepayer Funded Energy Efficiency Schemes. These allow energy suppliers to increase the cost a user pays for a unit of energy on the condition that the suppliers use this extra revenue to promote energy efficiency schemes which reduce consumption. If that sounds a bit Ponzi-like, it is. But in the short term, if it works, everyone should win. Consumers pay more per unit of electricity, but use less, so save overall. Utilities need to invest less, so future energy price rises should be contained, but the rate increase keeps their profits up. And as less energy is used, CO2 emissions are reduced, keeping the regulator and Mother Earth happy.

But does it work? In areas where these schemes operate, consumption has gone down, so negawatt proponents claim it's effective. But energy usage has also gone down elsewhere, which raises the question of whether we are measuring a real effect at all. Has the siren song of the negawatt befuddled both utilities and regulators, removing their ability to make rational judgements? In the US, billions of dollars are being spent on these schemes, whilst in the UK, DECC has built its justification for smart metering on the same unproven promise of jam tomorrow. So how do we separate belief from reality?

Let's start with the history. Ratepayer Funded Energy Efficiency Schemes originated in the US. They predate, and are separate from, the recent stimulus funding which poured around $83 billion of taxpayers' money into the energy industry. They were based on the observation that household energy use was rising year on year, as consumers purchased more energy-hungry appliances and became less tolerant of climate variation, swapping seasonal sweaters and shorts for heating and air-con. Analysts predicted that energy usage would continue to rise inexorably as TVs grew bigger and colonised every room in the house, and as we ultimately traded in our internal combustion engines for electric cars. It was a backdrop that was deeply worrying for the energy industry, as it didn't just call for more generating capacity; it put more load on a US grid that hadn't been significantly upgraded since the last tranche of Government stimulus during the Depression of the 1930s.

The prospect of reducing or curtailing this ever-increasing demand, and being paid to promote that restraint, was music to the ears of energy suppliers and power generators. It also played to the emerging green credentials of some of the more energy-profligate states. California embraced the concept, requiring public utilities to acquire all available energy efficiency savings before procuring new generation. Massachusetts and others followed suit, enforcing the rule of efficiency before new generation. Hence the concept of Ratepayer Funded Energy Efficiency programs was formed. Whereas utilities had previously been allowed to increase consumer bills to pay for new generation capability, they were now allowed to raise bills if they used that money to promote energy efficiency.

It’s interesting to extrapolate the principle to the extreme. In that case we’d all reduce our energy consumption to almost nothing, but still pay energy suppliers the same amount as we do today, just to keep them in the business to which they’re accustomed. Which suggests there may be a flaw in the scheme.

Last year US consumers paid almost $6.8 billion to fund these schemes. It's forecast that by 2020 that figure will have doubled, which means that over the next decade US ratepayers will be contributing around $100 billion to energy efficiency schemes via their utility bills. That's more than the amount of stimulus funding which went into the smart grid. But energy efficiency tends to be a game of diminishing returns. There are some easy gains to be made at the start, but subsequent ones become increasingly smaller. The concern is that utilities are so captivated by the scheme that they're not looking at the detail. And given regulators seem happy to increase the ratepayer levy each year, why should utilities bother to check whether it still makes sense?
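That decade figure is simple arithmetic. As a rough check, here's a short Python sketch; the linear ramp between this year's spend and the doubled 2020 figure is my own assumption, and only the start, end and approximate total come from the forecasts above.

```python
# Rough arithmetic behind the decade figure: spend ramping roughly linearly from
# ~$6.8bn now to ~$13.6bn ("doubled by 2020") sums to around $100bn over ten years.
# The ramp shape is an assumption; only the endpoints come from the forecasts.

start, end, years = 6.8, 13.6, 10
annual = [start + (end - start) * y / (years - 1) for y in range(years)]
print(f"Cumulative spend: ${sum(annual):.0f} billion")  # ~$102 billion
```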

There's increasing evidence that it doesn't make sense. Utilities running energy efficiency programs claim that household usage is decreasing as a result. But the same decrease is also evident in territories that don't promote energy efficiency. The fact is that appliance manufacturers have also been beaten up by governments, and many of them are now selling new products on the basis of their energy efficiency features. Five years ago large plasma TVs soaked up hundreds of watts. Today LED televisions consume little over 100 W, and by 2018 they're expected to use less than 60 W for a 42″ screen. Their standby power has also shrunk from tens of watts to less than a watt in many cases. The same is true of chargers, which now draw less than a quarter of a watt when the item they're charging is fully charged or disconnected. The old cry of "vampire power", where unused devices supposedly soaked up hundreds of dollars' worth of energy each year, is becoming a historic myth. Not that that stops companies selling expensive radio-controlled plugs to try and save you money from vampire devices.
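The arithmetic behind that claim is straightforward. Here's a hedged sketch; the tariff of around 13 cents per kWh and the 20 W figure for an older device are my assumptions, not numbers from any survey.

```python
# Why a quarter-watt standby charger no longer justifies the "vampire power" scare.
# Tariff and the older-device standby figure are assumptions for illustration.

HOURS_PER_YEAR = 8760
PRICE_PER_KWH = 0.13  # USD, assumed typical US residential rate

def annual_standby_cost(standby_watts: float) -> float:
    """Cost of leaving a device in standby all year at the assumed tariff."""
    return standby_watts / 1000 * HOURS_PER_YEAR * PRICE_PER_KWH

print(f"Older device at 20 W standby:   ${annual_standby_cost(20):.2f}/year")   # ~$22.78
print(f"Modern charger at 0.25 W standby: ${annual_standby_cost(0.25):.2f}/year")  # ~$0.28
```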

After the big hits of insulation and energy efficient lightbulbs, many of the quick wins from energy efficiency programs are fading away. Which leaves consumers confused. The Shelton Group’s CEO, Suzanne Shelton, summed it up in their recent Utility Pulse 2013 report. She points out it’s another case of the Snackwell’s effect. She sees consumers saying, “I bought these CFLs so now I can leave the lights on and not pay more. I bought a high-efficiency washer and dryer because I want to do more laundry without paying more. I ate the salad, so I can have the chocolate cake.”  Suzanne argues that an energy disconnect has led to defeat, with users feeling victimized by their energy bills and powerless to the point where they’re making fewer energy-efficient improvements. Shelton’s research shows consumers made only 2.6 improvements in 2012 compared with 4.6 in 2010. That level does not translate to significant energy savings.

Nor were they necessarily useful improvements. 81.9% of consumers claim to turn off lights when they're not in a room. That's good, but it won't save them much money once they've moved to LED lighting. 51.5% unplug chargers when not in use, whilst 50% unplug small appliances, which are probably turned off anyway and consuming no energy. These are no longer big wins. When questioned about the remaining elephant in the room – HVAC usage – just over half say they turn down their thermostat in winter and turn it up in summer. But do they?

And that's the problem. With very few exceptions, we don't know what consumers actually do with their energy. The US has rolled out over 40 million smart meters, which are as dumb as anything called smart could possibly be. They've been designed to make billing and bill disputes easier for the energy supplier, not to let either utilities or users understand what is consuming the energy they pay for. It means that no-one knows whether a single cent spent on Ratepayer Funded Energy Efficiency programs has actually worked, as there's no evidence base. Consumers may have made these changes anyway, in which case Ratepayer Funded Energy Efficiency schemes have just been a tax on common sense. The fact that energy use has decreased in many countries suggests that could well be the case.

On the other hand, they may have worked. The billions spent on TIPs (Technology Installation Programs, like putting in demand response hardware, giving away compact fluorescent light bulbs or In Home Displays, or providing door insulation) could have been solely responsible for the savings. Even the slightly dodgier ratepayer funded initiatives, such as TV advertising, may have worked. In which case the fact that others in the world followed the same trend must just have been chance or coincidence. Again, we don't know.

What is needed within the industry is verification. There are ways of trying to understand what is really happening and whether these schemes do what they claim. Some smart meters can record data more frequently, which allows data analysis companies to use a technique known as Non-Intrusive Appliance Load Monitoring (NIALM), or Load Disaggregation, to analyse how energy is being used in each household. It allows the pattern of energy consumption before and after any initiative to be compared, to see what is responsible for the overall change. It's the only real evidence-based way to tell whether a reduction is a direct consequence of the program. Without it, nobody knows whether the ratepayer funded program has delivered any value, or is just a meaningless tax on energy users.
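For readers unfamiliar with the technique, here's a deliberately simplified sketch of one flavour of load disaggregation. The appliance signatures and the brute-force combinatorial matching are illustrative assumptions on my part; commercial NIALM systems work from much richer data (switching transients, reactive power, higher sampling rates) and proper statistical models.

```python
# Minimal sketch of combinatorial load disaggregation (one flavour of NIALM),
# assuming steady-state appliance power signatures are known in advance.
# Real systems are far more sophisticated; this only illustrates the idea of
# explaining a total meter reading as a sum of individual appliance loads.
from itertools import combinations

APPLIANCES = {"fridge": 120, "kettle": 2000, "tv": 110, "washer": 500}  # watts, assumed

def disaggregate(total_watts: float) -> set:
    """Return the subset of appliances whose combined draw best matches the reading."""
    best, best_err = set(), abs(total_watts)
    names = list(APPLIANCES)
    for r in range(1, len(names) + 1):
        for combo in combinations(names, r):
            err = abs(total_watts - sum(APPLIANCES[n] for n in combo))
            if err < best_err:
                best, best_err = set(combo), err
    return best

# A 2230 W reading is best explained by fridge + kettle + TV running together.
print(disaggregate(2230))  # {'fridge', 'kettle', 'tv'}
```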

Consumers have a right to know. Having picked and eaten the low-hanging fruit, the next trick in the ratepayer funded scheme is to give away vouchers to help users purchase more efficient appliances. No-one denies that more efficient appliances are good, but few point out the reality of buying and running them. An exception is the US Energy Information Administration, which is a mine of useful information. They estimate that if you decide to buy the most efficient refrigerator, it will take you 19 years for the cumulative savings in running costs to recoup the upfront cost. They point out that only around 8% of homes have a fridge which is that old, so giving out a voucher which is funded by higher electricity rates is a double whammy for 92% of the population, but a nice little earner for the energy suppliers giving out the vouchers.
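To see how a payback period like that arises, here's a hedged calculation. Only the 19-year answer echoes the EIA figure quoted above; the price premium, annual saving and tariff are illustrative assumptions chosen to show the shape of the sum.

```python
# Hedged payback arithmetic for the refrigerator example. The inputs below are
# assumptions for illustration; only the ~19-year result mirrors the EIA estimate.

price_premium = 150.0       # extra upfront cost of the most efficient model, USD (assumed)
kwh_saved_per_year = 60.0   # annual saving versus the model it replaces (assumed)
price_per_kwh = 0.13        # USD per kWh (assumed)

payback_years = price_premium / (kwh_saved_per_year * price_per_kwh)
print(f"Payback period: {payback_years:.0f} years")  # ~19 years
```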

Almost everyone knows that it is an inherently good thing to save energy. But is it good enough to warrant asking us to pay for the privilege? In an age of smart metering, surely both regulators and energy suppliers should have a responsibility to provide evidence to back up a policy that will cost US ratepayers almost $100 billion over the next decade. Remember that this is more than the entire stimulus funding that went into the Smart Grid. It's time that blind belief was replaced by some hard facts.