WAPA phases out Equipment Loan Program

Based on a recent evaluation by WAPA’s Assessment Team, Energy Services is sunsetting its popular Equipment Loan Program. The Assessment Team, which was established in 2017, has been studying WAPA programs and initiatives to ensure that they support WAPA’s mission and bring value to the customer. The evaluation concluded that the program had successfully accomplished its original objective of giving power customers the opportunity to test out expensive diagnostic tools that might help them with planning, operations and maintenance.

Diagnostic tools like the Sense Home Energy 3300 power monitor have become less expensive, more accurate and easier to handle than the monitors of even 10 years ago.

WAPA launched the Equipment Loan Program more than 30 years ago when diagnostic tools were often large, cumbersome and expensive. The price of an infrared camera, for example, used to run to several thousand dollars for a basic model. Now you can pick up a pocket-sized camera at Home Depot for a little more than $200. There are even apps you can download to take IR pictures with your cellphone. Likewise, anemometers and weather stations have come down in price so that entities on a tight budget—schools, small municipal utilities—can afford to purchase their own.

Keeping pace with the latest technology has also become a problem for the Equipment Loan Program. The technology behind the tools used to change more slowly, so the program could provide customers with state-of-the-art equipment, or close to it. Today, a new and genuinely improved model seems to come out every couple of years. Even with more affordable prices, updating the tool library becomes an expensive proposition. At the same time, customers often can buy the latest version of a particular tool without denting their own budgets.

These changes in the marketplace have led to a sharp drop in the number of customers using the Equipment Loan Program. At the same time, many of the tools have become outdated. Were the program to continue, bringing the library up to date would be costly. The decision to end the program saves about $177,000 annually—funds that can be directed toward efforts that offer customers greater value.

All of the existing loan requests have been filled, and we are in the process of retrieving the equipment so that it can be disposed of as federal law requires.

Going forward, WAPA customers will have to make other arrangements for their equipment needs. However, most of the diagnostic tools in the Equipment Loan Program library are readily available from local vendors for rental or purchase. Also, you can contact your regional Energy Services representative for suggestions on where to find tools.

Your support of the Equipment Loan Program over the years has made it a highlight of Energy Services. It has allowed us to meet our customers, learn about your unique operations and find solutions that improve safety, efficiency and occasionally your bottom line. As hard as it is to say goodbye to the Equipment Loan Program, we consider it a success to retire a program that has served its purpose and met your needs.

ACEEE: Economists, energy practitioners need to work together to improve energy efficiency programs

In a recent blog post, Steven Nadel, executive director of the American Council for an Energy-Efficient Economy, suggested that energy-efficiency programs could benefit if economists and energy professionals combined their skills, instead of talking past each other.

In the past year, economists have been producing more and more papers questioning the effectiveness of energy-efficiency programs and policies. Acknowledging that not all programs are well-designed, Nadel pointed out that the studies, too, have flaws that prevent them from providing meaningful evaluation.

One problem, he observed, is that the two industries use different methods to measure results. Economists tend to prefer rigorous evaluation through randomized control trials. In these studies, a large group of potential participants is randomly assigned to either a study or control group. But randomized control trials can be very difficult to implement, as even some economists admit. In full-scale programs that are available to all utility customers, random assignment to a control group is simply not possible.
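
To make the approach concrete, here is a minimal sketch in Python of the difference-in-means comparison at the heart of a randomized control trial. All of the consumption figures are made-up illustrations, not data from any study Nadel cites.

    # Compare average consumption between randomly assigned groups.
    study_group_kwh = [930, 880, 910, 870, 900]       # customers offered the program
    control_group_kwh = [1010, 990, 1020, 980, 1000]  # randomly held-out customers

    def mean(values):
        return sum(values) / len(values)

    # With random assignment, the difference in means estimates program savings.
    savings_per_customer = mean(control_group_kwh) - mean(study_group_kwh)
    print(f"Estimated savings: {savings_per_customer:.0f} kWh per customer")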

In recent years, the energy-efficiency community has increasingly relied on the use of “deemed savings estimates” that are supposed to be based on prior evaluations. Unfortunately, these evaluations are not always as rigorous or as frequent as they need to be to give an accurate estimate.

Some study designs evaluate only certain aspects of a program, while overlooking goals and benefits that were central to the implementers’ intent, the ACEEE executive director said. He also noted that there have been times economists applied conclusions drawn from one evaluation to programs that have little in common with the one studied.

Nadel proposed that the two sides work together: first, to identify typical and similar program models for study; and second, to develop evaluation methods for those programs that combine each community’s professional strengths. Economists tend to be good at research methods, he noted, but don’t always understand the markets they are evaluating. Energy-efficiency program managers need to convey to researchers the program goals and the potential benefits that go beyond simple cost-benefit analysis.

Evaluation of energy-efficiency programs to determine what works—for utilities and customers—is an ongoing challenge for program designers. Nadel concluded that if the economic and energy-efficiency communities could learn to collaborate rather than work in silos, the studies they produce could lead to more effective programs.

Source: American Council for an Energy-Efficient Economy, 12/8/15

Ask the Energy Experts: Measuring results for appliance rebate programs

Question:
Our utility is designing a rebate program for customers who purchase energy-efficient refrigerators. To help us estimate potential savings, we need information on the energy consumption of older refrigerators that may run longer (during each on/off cycle) or even continuously.

Answer:
The average cycle times of a well-maintained refrigerator should not change as the appliance gets older, unless it is somehow damaged. If a refrigerator does run continuously, something is wrong. Possible reasons for older refrigerators running longer or continuously include:

  • Dust and debris buildup on the condenser coil
  • Blocked internal vents inside an overloaded refrigerator
  • Damaged door seals or door misalignment
  • Frost buildup on the inside of manual-defrost refrigerators, or a malfunctioning defrost mechanism on automatic-defrost refrigerators
  • Partial loss of the refrigerant charge due to slow leakage of refrigerant

Adding a repair component to your program may expand its reach, increase energy savings and build trust with your customers. Also, consider using bill stuffers and other outreach opportunities to educate customers on routine refrigerator upkeep. You can find more tips for efficient refrigerator operation on Energy Saver, a Department of Energy website that helps consumers reduce their carbon footprint.

ENERGY STAR-qualified refrigerators use 15 percent less energy than non-qualified models. Models with top-mounted freezers use 10 to 25 percent less energy than side-by-side or bottom-mount units. (Artwork by DOE Energy Saver)

Calculating savings
A refrigerator that runs continuously may simply have reached the end of its useful life, in which case it is going to be replaced anyway. Usually, the goal of an incentive program is to encourage customers to replace older, less-efficient, but still functioning, appliances with high-efficiency models. A properly functioning older refrigerator uses less energy than a continuously running one, so calculating your program’s energy savings against the latter will give you an overly optimistic estimate.
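
Here is a minimal sketch in Python of the baseline problem described above. The annual kilowatt-hour figures are illustrative assumptions, not measured values.

    # Annual consumption assumptions (illustrative only).
    old_fridge_normal_kwh = 1200   # properly functioning older unit
    old_fridge_failing_kwh = 1900  # unit running continuously
    new_fridge_kwh = 450           # replacement high-efficiency model

    # Using the failing unit as the baseline overstates the program's effect.
    optimistic_savings = old_fridge_failing_kwh - new_fridge_kwh
    realistic_savings = old_fridge_normal_kwh - new_fridge_kwh
    print(f"Optimistic estimate: {optimistic_savings} kWh per year")
    print(f"Realistic estimate: {realistic_savings} kWh per year")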

Documentation of Calculation Methodology, Input Data, and Infrastructure, by Home Energy Saver and Lawrence Berkeley National Laboratory, is a useful publication for calculating the energy consumption of major appliances, including older refrigerators. Updated in 2008, this reference includes energy factors for several different refrigerator styles from 1972 to 2003. A table gives typical refrigerator sizes, and there are equations for calculating adjusted volume and energy consumption.
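
As a rough illustration of the style of calculation the reference describes, here is a short Python sketch. The 1.63 freezer multiplier is the standard Department of Energy adjusted-volume factor for refrigerator-freezers; the regression coefficients below are placeholders, not values taken from the publication.

    def adjusted_volume(fresh_ft3, freezer_ft3):
        # Freezer space is weighted more heavily because it is kept colder.
        return fresh_ft3 + 1.63 * freezer_ft3

    def annual_kwh(adj_volume_ft3, intercept, slope):
        # Consumption modeled as a linear function of adjusted volume;
        # look up the intercept and slope for the unit's style and vintage.
        return intercept + slope * adj_volume_ft3

    av = adjusted_volume(fresh_ft3=14.0, freezer_ft3=6.0)  # about 23.8 cubic feet
    print(f"Estimated use: {annual_kwh(av, intercept=300.0, slope=35.0):.0f} kWh per year")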

Home Energy Saver also provides a chart that gives default energy consumption estimates for a variety of home appliances and systems.

Other resources you may find helpful include the ENERGY STAR Refrigerator Retirement Savings Calculator. It allows visitors to estimate the energy savings from replacing refrigerators manufactured up to 2008 with an ENERGY STAR-qualified model. Top Ten USA shows annual energy cost savings for the 10 most efficient refrigerators on the market. Knowing what refrigerator models are readily available in your area might also help in estimating program savings.

Estimating Appliance and Home Electronic Energy Use, an article on the Energy Saver blog, makes good reading for customers who are particularly engaged in managing their home energy use. Cultivating a relationship with your “true believers” is a good way to gain anecdotal data on the real-life performance of your programs. This information can be valuable in refining existing programs and developing new ones.

Thermostats are emphasis for newest PLMA interest group

The Peak Load Management Alliance (PLMA) has announced that Brian Doyle and Lee Hamilton of Xcel Energy are co-leading PLMA’s new Thermostat Interest Group.

The PLMA Thermostat Interest Group will examine the costs and benefits of all types of utility-sponsored programs that leverage thermostat technology to deliver demand response, energy efficiency or other system benefits. A group goal is to identify the resources and partners that best communicate the value of smart thermostats to utility programs, rather than to focus on a specific technology or solution.

The group intends to collect documentation from published and not-so-public sources such as utility thermostat pilot and program evaluations. These and other third-party resources covering technology evaluations, program design concepts, market assessments, savings potential and more will be selectively shared with members.

The group has already conducted an initial meeting with founding PLMA organization representatives. Based on a strong interest level, the group will host a half-day workshop on Nov. 3 prior to the 15th PLMA Fall Conference in Philadelphia.

PLMA Interest Group membership is restricted to representatives from PLMA member organizations, but any organization is welcome to join PLMA. Source: Peak Load Management Alliance, 9/15/14

AESP conference connects implementation and evaluation

Oct. 15-17
Long Beach, CA

In the world of demand-side management (DSM), treating program management and measurement as unrelated components makes everyone’s job that much more difficult. Learn how to deliver more effective DSM programs by breaking down those silos at Implementation and Evaluation, No Longer an Odd Couple, presented by the Association of Energy Services Professionals (AESP).

This conference is for anyone whose job is to implement and measure energy-efficiency programs, with a focus on utility experience. The event kicks off with a special utility-only roundtable session where utility program managers will share common challenges and solutions, and a pre-conference training course on Demand Response.

Explore the latest developments in DSM program management and evaluation, and delve into the dynamics of the implementer-evaluator partnership. Energy professionals from across the country will share insights, best practices and successful case studies that show how you, too, can leverage the partnership to achieve mutual objectives.

Register for the AESP Fall Conference.

Proving the savings of energy-efficiency programs

This session focused on creating plans to evaluate, measure and verify energy efficiency programs.

David Reynolds of Energy & Resource Solutions presented a case study on a collaborative effort by California public power utilities to evaluate their programs. The motivation was two state bills that required investor-owned utilities to verify savings from energy-efficiency programs.

As a first step, the utilities worked together to develop an energy-efficiency database and reporting tool. A contractor was hired to train the staff, but the training went both ways, since the contractor had never worked with public power providers. The collaborative developed an implementation plan, the member utilities implemented it individually and everyone shared lessons learned.

What they discovered was that there is a lot of room for improvement in data collection and tracking systems. Even so, feedback is valuable for program design—it shows what measures work and helps utilities to improve program effectiveness.

Economies of scale can reduce the cost of operating programs. Leverage the information that is already out there. Prioritize the program elements to be evaluated and spread the effort across several years.

Evaluation is a quality-assurance tool driven by documentation, and it is your best chance of keeping program costs down. It is simply good program management: it shows you are in charge, instills confidence in the program and proves your efforts align with state or national policies.

For the process to work, you must be clear about why you are doing the evaluation in the first place. Don’t try to evaluate too many aspects of the program. If you can write a clear, concise question, you can set your goal. Design the evaluation to answer that single question.

If you are working with a third-party provider—or even if you are conducting your own evaluation—get your documentation together. That includes a resource portfolio structure, program description, rationale, customer markets, forms, agreements, rules and rebates. Assess your data management system and existing quality-assurance procedures up front. Include the evaluation in your program budget.

The plan is the heart of your evaluation framework. It should include a program summary, tracking and reporting, overall priorities, secondary objectives, overall approach, accounting principles and preferred methodology.

The budget will define and limit your evaluation effort. California utilities were spending about 1 percent of their operating budgets on evaluation in the early 2000s. The national average now hovers between 3 and 5 percent, and some evaluation consultants suggest between 4 and 16 percent. Expect some additional costs for new programs. If your funds are limited, focus on an in-depth effort for a few priority programs.
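
Applied to a hypothetical program budget, those rules of thumb bracket the evaluation spend, as in this Python sketch. The $500,000 budget is an assumed figure purely for illustration.

    program_budget = 500_000  # assumed annual program budget in dollars

    benchmarks = [
        ("National average, low", 0.03),
        ("National average, high", 0.05),
        ("Consultant guidance, low", 0.04),
        ("Consultant guidance, high", 0.16),
    ]
    for label, share in benchmarks:
        print(f"{label}: ${program_budget * share:,.0f}")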

Tracking and reporting are expensive, so adopt a least-cost preference for how you collect data. Do data collection in house as much as possible, and shift it onto the customer where you can to lower costs. In general, move data collection from the utility to a third party to the customer.

Data should be easy to share. Archive applications so you don’t lose data when staff turns over. Analysis is always difficult, so make it simple where you can.

The report is a technical document, but you still need to communicate the good news. Be your own advocate, and frame the discussion so that other entities don’t do it for you.

For more information, read the reports on the Northern California Power Agency, California Measurement Advisory Council and California Public Utilities Commission websites.

Program measurement in Florida

Gainesville Regional Utilities had 22 energy-efficiency programs in 2009 and chose four to evaluate: refrigerator recycling, air conditioner upgrades, attic insulation, and duct sealing and repair. The choice of programs to evaluate may be based on expense, popularity or political considerations.

Programs that are unaffected by externalities are easier to evaluate. “Measures in, savings out” programs, such as the refrigerator buyback and recycling program or light bulb giveaways, fall into this category.
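
For such programs the arithmetic is direct, as in this minimal Python sketch; both figures are illustrative assumptions rather than evaluated values.

    units_recycled = 250        # refrigerators collected by the program (assumed)
    deemed_kwh_per_unit = 1000  # deemed annual savings per unit (assumed)

    print(f"Estimated savings: {units_recycled * deemed_kwh_per_unit:,} kWh per year")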

More complex programs are those where the savings are impacted by environmental factors. Attic insulation, duct sealing and air conditioner upgrades will require more statistical analysis to evaluate. You will have to look at what is driving savings and how market noise impacts customers.

It will be necessary to establish a control group and develop a model of normalized annual consumption, with estimated energy impacts based on statistically adjusted engineering models. For analysis of this complexity, engaging a third-party consultant may be worth every penny.
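
Here is a minimal sketch of one common approach to normalized annual consumption, a degree-day regression fit to monthly billing data. The figures are made up, and a real evaluation would use the statistically adjusted engineering models mentioned above.

    import numpy as np

    # Monthly cooling degree days and metered use for one home (illustrative).
    cdd = np.array([50, 80, 150, 250, 400, 500, 520, 480, 350, 200, 90, 60])
    kwh = np.array([620, 660, 780, 900, 1150, 1300, 1330, 1270, 1060, 840, 680, 630])

    # Fit kWh = base + slope * CDD by least squares.
    slope, base = np.polyfit(cdd, kwh, 1)

    # Re-predict consumption under a typical weather year's degree days.
    typical_cdd = np.array([55, 85, 140, 240, 390, 490, 510, 470, 340, 195, 95, 65])
    normalized_annual_kwh = (base + slope * typical_cdd).sum()
    print(f"Normalized annual consumption: {normalized_annual_kwh:,.0f} kWh")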

Gainesville discovered that estimated savings evened out across the program suite, though adding cost variables improved the payback. The point, however, is to be able to identify programs that are not performing well and either fix or eliminate them.