Insights from the DOE Guidance: Building a Strategic Evaluation Plan with Actionable Intelligence

Posted on August 14, 2023

Read Part 1 of this blog series here.

Let's face a hard fact: evaluation compliance can be intimidating, even scary. It is frequently seen as an annoying task that implementers and program administrators have to do but rarely get any value from. Why? Historically, evaluation has come too late in the process to help course-correct, and it gets a bad reputation when it is used only as a hammer.

But what if data, measurement, and verification instead served as the wood, the nails, or even the blueprint to help you build your dream program? In this post, we'll cover the core evaluation requirements in the DOE guidance for IRA-HOMES and make the case for embedded measurement and verification (M&V) to help states comply with evaluation expectations and build the successful home performance programs of the future.

Prepare For DOE-Led Evaluation By Tracking Your Results

The U.S. Department of Energy (DOE) has clearly laid out the data and measurement expectations for evaluation in section 3.1.6. Program Requirements: Data Collection and Evaluation. DOE states that it will conduct process, impact, and market transformation evaluations of programs over the ten-year period. States, therefore, need to design programs with this requirement in mind. States may choose to opt out of these evaluations in favor of conducting their own, as long as their evaluation meets the requirements outlined by DOE.

Data collection will be a big part of this program's implementation, creating a great opportunity for states to set a new bar for data-driven optimization in their program designs. It boils down to four key pieces of information: eligibility, savings, project cost, and home type. By embedding measurement and verification (M&V) software and practices in the operations of the program, states and program implementers can optimize programs in flight rather than waiting for a final result to come back from a DOE-led evaluation, possibly years later. States, program administrators, and service providers can use that local intelligence to continually adapt to deliver successful outcomes for residents in their state. 
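As a purely illustrative example, a state's tracking system might boil those four items down to a record like the following Python sketch; the field names and home-type categories are our assumptions, not a DOE schema:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class HomeType(Enum):
    # Placeholder categories; the authoritative definitions live in sec. 2.1 of the guidance.
    SINGLE_FAMILY = "single_family"
    MULTIFAMILY = "multifamily"
    MANUFACTURED = "manufactured"


@dataclass
class ProjectRecord:
    """One rebate project's core evaluation data (hypothetical schema)."""
    project_id: str
    income_eligible: bool                  # eligibility (e.g., low/moderate income)
    estimated_savings_kwh: float           # modeled estimate at project approval
    measured_savings_kwh: Optional[float]  # filled in by M&V after installation
    project_cost_usd: float                # total cost from the final invoice
    home_type: HomeType                    # per the sec. 2.1 definitions
```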

Leaning into embedded measurement and verification allows administrators to track and monitor the full program and see the distribution of site-level results — both requirements outlined in the guidance. Embedded M&V also helps administrators assess aggregator performance and target support if aggregators are struggling to deliver results for residents. The guidance's requirement that states "take corrective action if actual savings results are less than 70% of estimated savings" makes having a systematic way to stay on top of actual results critical.

Active intelligence on project performance also gives aggregators the rapid feedback they need to continue to improve with each project delivered. Aggregators can use this information to make important adaptations that will help accelerate program adoption and improve customer experience. Many aggregators have experienced the value of this powerful feedback firsthand.

State administrators can benefit from this stream of intelligence to fulfill the standard reporting requirements of DOE-led evaluation with no extra hassle. By keeping some local control and regularly viewing actual results, states can mitigate evaluation surprises. This also ensures that programs consistently deliver meaningful results in line with state objectives.  

For some states, opting out of the DOE-led impact evaluation may make sense. DOE asks, however, that a state's own plan meet certain criteria so the results will be comparable and reliable. Compliant evaluation plans must:

"Appl[y] the DOE evaluation recommendations as described in the Technical Information, Best Practices, and Implementation Recommendations and collect DOE-requested information to allow for consistent comparison of results."  

Adopting existing open-source advanced M&V software and including it in the ongoing tracking and monitoring plan will help states meet many of these recommendations and expectations for valuable feedback.

The guidance also asks states to submit their evaluation plans soon after the program launch.  DOE requires states to: 

"Submit(s) the evaluation plan within three months of rebate program launch for DOE approval."

This requirement reflects the emerging best practice of running evaluation alongside implementation because of its potential to enable data-driven decision-making. Having an M&V plan is essential for running measured programs: performance is the basis of payment, and everyone in the system has to be on the same page about methods and calculations. As a result, measured programs typically require administrators to submit detailed M&V plans concurrently with the program plan. It's also a smart idea to have a systematic feedback loop for the program, regardless of the design.

The guidance also sets a high bar for the quick turnaround of evaluation results; unless otherwise agreed to, programs have a year and a half to get results to the department. States must:  

"Provide(s) final evaluation results to DOE within 18 months of evaluation approval, or the agreed-upon timeline in the evaluation plan."

Without ongoing, embedded M&V, this expectation will be nearly impossible to meet. Automated M&V software can provide the calculated impacts for a streamlined evaluation process and, as noted already, the significant added benefit of feedback to optimize a program in flight. All programs will be required to collect the data needed for evaluation. States that utilize a measured program approach will have an open-source advanced M&V software solution as part of their delivery ecosystem. Project- and program-level data will be archived and retrievable when evaluators knock at the door. Aggregators, administrators, and state agencies will be able to see results quickly — well within the DOE-required timeline of 18 months — and have a little more control over communicating results to legislatures, governors, and the general public in their own states.

Use Open-source Advanced M&V Software to Meet Impact Evaluation Criteria  

The OpenEEmeter is an example of software designed exactly for this purpose: delivering ongoing impact calculations for every project, aggregator portfolio, and the program as a whole. With the OpenEEmeter, administrators have documentation of the change in monthly and hourly (if available) weather-normalized energy use of a home before and after implementing a home energy efficiency retrofit. Partially funded by DOE labs, the OpenEEmeter was developed and tested in an open-source peer working group with several evaluation professionals, and it has been deployed with utilities, regulators, and aggregators nationwide for over a decade.
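To make the underlying mechanics concrete, here is a deliberately simplified degree-day regression in Python. It is a minimal sketch of the weather-normalization idea, not the OpenEEmeter's actual implementation, which follows the far more rigorous open-source CalTRACK methods:

```python
import numpy as np

def fit_baseline(hdd, cdd, daily_kwh):
    """Fit daily use ~ intercept + HDD + CDD over the pre-retrofit (baseline) year."""
    X = np.column_stack([np.ones_like(hdd), hdd, cdd])
    coef, *_ = np.linalg.lstsq(X, daily_kwh, rcond=None)
    return coef  # [intercept, heating slope, cooling slope]

def metered_savings(coef, hdd_post, cdd_post, daily_kwh_post):
    """Savings = the model's counterfactual under post-period weather minus actual use."""
    X_post = np.column_stack([np.ones_like(hdd_post), hdd_post, cdd_post])
    counterfactual = X_post @ coef
    return float(np.sum(counterfactual - daily_kwh_post))
```

The baseline model learns how the home's pre-retrofit usage responds to heating and cooling demand; savings are the gap between what that model predicts under post-retrofit weather and what the meter actually recorded.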

Because the methods and all calculations are transparent, using the OpenEEmeter means that evaluators working for DOE can easily audit and review results. If states opt out of the DOE-led impact evaluation, their results will still be comparable in the national study. Including a plan to track and monitor program impacts with embedded open-source M&V is a no-regrets component of a state's application.

Streamline the Path to a Compliant Plan with a Measured Program Design

Given the overarching data and evaluation requirements for the program, the measured program design option, which bases rebates on measured changes in energy use, may be one of the easier ways to meet expectations for all of the evaluation criteria with no incremental effort. The measured approach has the added benefit of providing flexibility for aggregators to offer valuable services to customers by incorporating the rebates into their offerings. The aggregators, not the customer, take on the performance risk.

Since the rebates are based on actual changes in energy usage, the measured approach includes open-source, auditable measurement and verification in the program design to provide near real-time feedback on performance. States choosing a modeled approach can still use open-source M&V. This ensures that the models are accurate, that the program is delivering actual results to the state's residents, and that it meets the evaluation compliance criteria.

Calculate Measured Energy Savings In Alignment with Current Practice

DOE's guidance on how to calculate savings for the measured approach did not contain many surprises.  Since the calculation of the rebates was outlined in the legislation, the guidance just needed to round out the edges on some specifics. The result was very similar to existing practice for measured programs being implemented today. A few additional definitions were included in the guidance, as well as some specific expectations on steps and timing for feedback.  Most of these specifics were intended to ensure the integrity of the measurement.   

When a state uses a measured savings approach, four key pieces of information are necessary, and they align with current practice for measured programs. In Section 3.1.1.1. Measured Home Efficiency Rebates, DOE specifies that the state must calculate rebates based on: (1) the reported energy savings measured through a DOE-approved open-source advanced M&V software, (2) household income level, (3) total project cost reflected in the final invoice or a payment rate as defined in Table 3, and (4) home type consistent with the definitions in sec. 2.1.
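As an illustration of how those inputs combine, here is a minimal Python sketch of a measured-path rebate calculation. The payment rate and cost-cap values below are placeholders, since the actual figures come from Table 3 of the guidance and vary with household income and home type:

```python
def measured_rebate(savings_kwh: float,
                    project_cost_usd: float,
                    payment_rate_usd_per_kwh: float,
                    cost_cap_fraction: float) -> float:
    """Hypothetical measured-path rebate: a $/kWh payment rate applied to
    measured savings, capped at a fraction of the invoiced project cost.
    Real rates and caps come from Table 3 and depend on income and home type."""
    if savings_kwh <= 0:
        return 0.0  # no rebate without measured savings
    return min(payment_rate_usd_per_kwh * savings_kwh,
               cost_cap_fraction * project_cost_usd)

# Illustrative call with made-up numbers, not Table 3 values:
rebate = measured_rebate(savings_kwh=3500, project_cost_usd=12000,
                         payment_rate_usd_per_kwh=0.75, cost_cap_fraction=0.5)
```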

Highlighted below are other definitions and steps found in Section 3.2.4.2 Calculating Measured Energy Savings of the guidance. In some cases, the guidance deviates a bit from current practice, but for the most part it represents continuity with the core principles and practices of measured programs.

  • Measured programs "must use advanced open-source M&V software, as approved by DOE, that includes capabilities for determining and documenting weather-normalized energy use of a home or portfolio of homes before and after implementing home energy upgrades." This rule is straightforward and in alignment with the IRA legislation and industry practice; no additional definitions for key terms were outlined in section 2.1 Definitions.
  • Estimates of energy savings for a participating home or multifamily building must be "based on the data and information collected in a home assessment." Collecting information on what equipment is being installed is common, but the pre-condition data required for a home prior to an intervention (see section 3.2.2 Program Requirements: Home Assessments) is significantly more than what's typically collected in measured programs in the field today. Extensive detail on pre-conditions is not typically collected for measured programs because measured performance payments, not pre-implementation project approval, drive accountability.
  • Just like in other measured programs, the software "Defines, calculates, and reports energy savings for the purposes of the rebate threshold as kWh or kWh equivalent." DOE noted that kWh equivalent refers to energy savings, rather than other benefits a state may recognize in its value stack, even if those benefits are the basis of the state's goals.
  • To ensure the integrity of measurement, DOE requires that "calculation of actual home- or portfolio-level savings include no less than 9 months of data after the final installation in the home or portfolio to ensure that seasonal effects are captured in the results," and that "if savings calculations are from less than 12 months post-installation, the calculation must include at least one peak energy season and both peak seasons if in a dual-peaking climate." Current measured program designs typically allow for quarterly or monthly payments based on calculated performance, with an annual true-up for seasonal effects. A minimal check of these data-window rules is sketched after this list.
  • The expectations for data collection are largely in line with standard practice for measured programs. DOE guidance requires that the administrator "collects and reports the data and information required in the Data & Tools Requirements Guide and PNNL expectations on data to be collected." More on this topic in our next blog.
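To show what the data-window rules from the list above might look like in practice, here is a minimal Python sketch; the month sets defining each peak season are our assumptions for illustration, since the real definitions would depend on the state's climate and evaluation plan:

```python
from datetime import date

# Hypothetical peak-season month sets for illustration only.
PEAK_MONTHS = {"summer": {6, 7, 8}, "winter": {12, 1, 2}}

def savings_window_ok(final_install: date, data_end: date, dual_peaking: bool) -> bool:
    """Apply the DOE minimums: at least 9 months of post-installation data,
    and at least one peak season (both, if dual-peaking) when the window
    is shorter than 12 months."""
    months = (data_end.year - final_install.year) * 12 + (data_end.month - final_install.month)
    if months < 9:
        return False
    if months >= 12:
        return True
    # Calendar months covered by the post-installation window.
    window = {(final_install.month + k - 1) % 12 + 1 for k in range(1, months + 1)}
    hits = [bool(month_set & window) for month_set in PEAK_MONTHS.values()]
    return all(hits) if dual_peaking else any(hits)

# Example: install finished mid-March, data through late December (9 months,
# includes summer) passes in a single-peaking climate.
assert savings_window_ok(date(2024, 3, 15), date(2024, 12, 20), dual_peaking=False)
```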

One of the core benefits of the measured path is its ability to mitigate risk across a portfolio of projects. Only the measured program model allows savings to be aggregated across a portfolio of homes, both for meeting the minimum savings threshold (15%) and for calculating the corresponding rebates. DOE added a little more guidance for reporting projects for a portfolio of homes. States must:

  • Meet the energy savings minimum across the portfolio.
  • Report required data for each dwelling unit within the portfolio.
  • Calculate the final rebate based on the payment rate (see Table 3 of the guidance) applied to the energy savings of each home and summed across the portfolio (a simple sketch of this aggregation follows the list).
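The sketch below shows one plausible shape for that aggregation in Python. It is simplified, ignoring income tiers and per-home cost caps, and applies the 15% threshold portfolio-wide as the guidance describes:

```python
def portfolio_rebate(homes: list[dict],
                     payment_rate_usd_per_kwh: float,
                     min_savings_pct: float = 15.0) -> float:
    """Hypothetical portfolio calculation: check the portfolio-wide savings
    minimum, then apply the payment rate to each home's measured savings and
    sum. Each home dict carries 'measured_savings_kwh' and 'baseline_kwh'."""
    total_savings = sum(h["measured_savings_kwh"] for h in homes)
    total_baseline = sum(h["baseline_kwh"] for h in homes)
    pct = 100.0 * total_savings / total_baseline
    if pct < min_savings_pct:
        raise ValueError(f"Portfolio saved {pct:.1f}%, below the {min_savings_pct}% minimum")
    # Rebates are priced per home, then summed across the portfolio.
    return sum(payment_rate_usd_per_kwh * max(h["measured_savings_kwh"], 0.0)
               for h in homes)
```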

The definition of and expectations for portfolio-level payments are also in line with typical measured programs in the field today. The notable difference is that most existing programs do not have a minimum savings threshold. The calculation of eligible performance incentives is done for each site and then aggregated to the portfolio and reported for the full program.

Finally, two key enforcement expectations shine a particularly bright light on the value of the measured approach and embedded measurement and verification for HOMES, irrespective of the path taken. For both measured and modeled program designs, states must:

  • Monitor the energy savings reliability of models and tools. While this is a reasonable expectation to ensure that models and tools are delivering, the reference to reliability is important as well. Open-source models have served as a reference point in past evaluations and should serve that role here as well.
  • Take corrective action if actual savings results are less than 70% of estimated savings. Only the measured program pathway is designed to track actual savings. For a modeled program to comply with this requirement, a state would need to calculate actual savings at the utility meter and compare them to predicted values. A measured program would already have this information in hand and could trace results back to underperforming projects and aggregators for further inquiry (a minimal version of this check is sketched below).
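As a minimal illustration of that check, a tracking system could flag projects that trip the 70% threshold with a few lines of Python; the field names here are our assumptions:

```python
def flag_underperformers(projects: list[dict], threshold: float = 0.70) -> list[str]:
    """Return IDs of projects whose actual metered savings fall below 70% of
    the estimate, the corrective-action trigger quoted above. Each project
    dict carries 'id', 'estimated_kwh', and 'actual_kwh' (hypothetical fields)."""
    return [p["id"] for p in projects
            if p["actual_kwh"] < threshold * p["estimated_kwh"]]
```

The same comparison can be rolled up by aggregator to target support where results are lagging.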

Embrace the Opportunities of Evaluation, Measurement & Verification for Success

DOE clearly cares that the impacts are real, but that doesn't mean verification has to wait until the end. The department has set a new standard and expectation for tracking and monitoring impacts to ensure that the IRA's significant investment in HOMES and HEEHRA yields robust results and charts a path for the future. Considered carefully and incorporated into implementation plans, embedded M&V will allow states, implementers, and aggregators to optimize their program delivery and deliver tangible outcomes for state residents. Meeting these requirements doesn't have to be intimidating; it's a value-add. The tools are available today, built on long-standing M&V best practices and adapted to deliver consistent, transparent feedback for you to succeed.

Our next blog will go into more detail on the data and security requirements in the guidance. Please contact us if you'd like to dig deeper!
