Proposed MINHERS Addendum 72, File and Field QA, Draft PDS-01

Comment #1

Amendment: Proposed MINHERS Addendum 72, File and Field QA, Draft PDS-01
Page Number: 1
Paragraph / Figure / Table / Note: 904.3.2.3.2
Comment Intent: Objection
Comment Type: General

Comment:

The 'new' proposed calculation method for File QA can only work if RESNET QA Staff has full access to the data collection tool that a QA Provider uses. That is the only way to actually verify the data is being collected. Otherwise, you're giving vertically integrated QA Providers carte blanche to commit fraud. We've all heard the rumors over the years about this rating company or that rating company and how they do drive-by ratings.

Quite literally giving a QA Provider the "choice" to do less work, with no requirement for greater oversight directly from RESNET, is going to lead to issues, especially with the 45L tax credit being tied to ENERGY STAR and Zero Energy Ready Home certifications.

I recommend language be added that mandates QA Providers give RESNET QA Staff full access to the data collection tool and all the data collected 24/7/365.

Proposed Change:

904.3.2.3.2 Based on the Provider’s combined rating count for all HERS Raters (total population), the Provider's Quality Assurance Designee shall be responsible for an annual QA File review of a statistically significant number of ratings sufficient to achieve a 95% confidence level in the results, as calculated by the RESNET National Buildings Registry (See below informative note 1). Ratings shall be selected for QA File review according to 904.3.2.7.1 AND the QAD shall ensure that a QA File review is completed on a minimum of one (1) rating on a completed home per year for each HERS Rater.

904.3.2.3.2.1 QA Providers that opt for the alternative calculation rate shall give RESNET QA Staff full access to all systems used to collect field data and documentation on ratings throughout the lifecycle of the rating.


Comment #2

Amendment: Proposed MINHERS Addendum 72, File and Field QA, Draft PDS-01
Page Number: 1
Comment Intent: Objection
Comment Type: General

Comment:

My experience and suggestion directly address this, but in another way. Please consider it!

Once a rater agrees to become a rater based on the standard of the time and is certified, the renewal standard for that rater should not change. It's just not fair and balanced to change renewal requirements once the two parties have reached an understanding about how to renew. This is just my fair and balanced opinion, but I am sure others may feel the same.

This came up for me while trying to meet the new rater renewal requirement that was added into the standard just prior to the start of the pandemic.

My Provider attempted to get access for QA to my only confirmed rating after the home was sold by the builder, six months after my renewal cycle was up. My QA Provider was denied access by the new homeowner. Seeing as that was my one and only confirmed rating, that placed me into a disciplinary status. My Provider did offer some options afterward, but at that point, based on distance and available time, I was unable to accept due to personal reasons going on at the time.

I can only guess that the new homeowner did not wish to unnecessarily allow strangers into their home. Based on current pandemic events, the homeowner's actions, now and for others in the future, are reasonable. From a homeowner's point of view, the people asking for access in this case came from across multiple states, visiting different communities, homes, and people. For a homeowner, allowing access is a risk they need to be able to voluntarily accept. For renewal of certification purposes, HERS Raters and Providers should not be placed into impossible gatekeeper situations by the standard when it comes to providing access to these homes.

Furthermore, I would like to point out that this places an undue hardship and expense upon Providers. It is not right that they should have to return to a city or state many miles away to make another attempt at access to another home when an attempt at a Confirmed Rating was already made but access was denied by a homeowner at the door.

In respect of the current world we live in, I am requesting that the RESNET Board give due consideration to this requirement, section 102.1.2.4.3 of the standard, and, for the purposes of all previous and future renewals of certification, not deny acknowledging a Rater's confirmed rating when a diligent attempt for access was made but was denied at the door by a homeowner!

I would like to suggest another alternative that may help with Provider workloads and maintain RESNET representation in local communities, for raters who do not actively upload ratings on a regular basis but wish to keep certification for credential purposes in representing the RESNET standard to local legislatures, local building officials, and communities: allow previously certified raters to renew certification without a confirmed rating. Once this renewal without a confirmed rating is exercised, the rater would no longer be able to upload ratings to the RESNET database until the rater performs a new field evaluation, but would retain RESNET Rater certification credentials and status. This change would allow raters to renew without being placed into a disciplinary status simply because a homeowner denied access, and non-active raters could continue to represent the RESNET standard while remaining in compliance.


Comment #3

Amendment: Proposed MINHERS Addendum 72, File and Field QA, Draft PDS-01
Page Number: 1
Paragraph / Figure / Table / Note: 904.3.2.3.2
Comment Intent: Objection
Comment Type: General

Comment:

The alternative calculation method is going to let a lot of bad data into the RESNET Registry. Per the self-reported QAD review time, most QA is done after the project is registered. Why let even MORE bad data into the Registry with less oversight?

The alternative calculation method should only be allowed when the QA Reviews are completed BEFORE the projects are registered with RESNET's Registry.

Proposed Change:

904.3.2.3.2 Based on the Provider’s combined rating count for all HERS Raters (total population), the Provider's Quality Assurance Designee shall be responsible for an annual QA File review of a statistically significant number of ratings sufficient to achieve a 95% confidence level in the results, as calculated by the RESNET National Buildings Registry (See below informative note 1). Ratings shall be selected for QA File review according to 904.3.2.7.1 AND the QAD shall ensure that a QA File review is completed on a minimum of one (1) rating on a completed home per year for each HERS Rater.

904.3.2.3.2.1 The alternative QA rate is only allowed when the QA Provider QA file reviews are performed prior to a rating being registered with the RESNET National Buildings Registry.


Comment #4

Amendment: Proposed MINHERS Addendum 72, File and Field QA, Draft PDS-01
Page Number: 1
Paragraph / Figure / Table / Note: 904.3.2.3.2
Comment Intent: Objection
Comment Type: General

Comment:

The alternative file and field QA rate takes into account the entire QA Provider's annual total ("904.3.2.3.2 Based on the Provider’s combined rating count for all HERS Raters (total population)..." and "904.3.3.2.1.1 Flexibility to choose which Raters to perform field QA on. Providers will be allowed to perform field QA based on 1% of their own Provider rating count...").

This reasoning is a fallacy, as not all QA Providers are vertically integrated. Some, like Building Efficiency Resources (aka The BER), also serve as a third-party QA Provider for independent rating companies.

The proposed language would create an unfair system between rating companies that use a 3rd-party QA Provider and vertically integrated providers: the rating companies that use a 3rd-party QA Provider would have a competitive advantage over vertically integrated QA Providers.

Example: A vertically integrated QA Provider does 3,000 projects in a given year. Per RESNET's sample set calculator, 341 projects (rounded up from 340.55) would need to be reviewed. A 3rd-party QA Provider that also does 3,000 projects would obviously have the same number of 341, but if they had 50 (or more) rating companies under contract for QA, the only backstop is that each Rater of Record would need to have at least ONE file review in a given year. If that number were spread evenly across the companies (341 / 50), you'd potentially perform only 6-7 file reviews for EACH of those rating companies (again assuming an even distribution of working raters, etc.).
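For reference, the 341-review figure above is consistent with the standard Cochran sample-size formula with a finite population correction, at a 95% confidence level and a ±5% margin of error. This is an illustrative sketch only; that RESNET's sample set calculator uses exactly this formula is an assumption (it reports 340.55 where this sketch gives 340.65):

```python
import math

def qa_sample_size(population: int, z: float = 1.96, p: float = 0.5, e: float = 0.05) -> int:
    """Sample size at a 95% confidence level (z = 1.96) with a +/-5%
    margin of error (e = 0.05), via Cochran's formula with a finite
    population correction. p = 0.5 is the conservative worst case."""
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)   # infinite-population size: 384.16
    n = n0 / (1 + (n0 - 1) / population)     # finite population correction
    return math.ceil(n)

print(qa_sample_size(3000))    # 341 file reviews for 3,000 ratings
print(qa_sample_size(100000))  # 383 file reviews for 100,000 ratings
```

Note that the sample size grows very slowly with the population, which is the commenter's point: 100,000 ratings require only 383 reviews under the same formula.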

How is it fair for a vertically integrated QA Provider to have to perform 341 file reviews on their own work, when a rating company working under a 3rd-party QA Provider can squeak by with literally 2% of the same amount of review?

I suggest changing the language to be based on a per Rating company basis, to have a more level structure.

Proposed Change:

904.3.2.3.2 Based on a Rating company's annual total HERS Ratings (total population), the Provider's Quality Assurance Designee shall be responsible for an annual QA File review of a statistically significant number of ratings sufficient to achieve a 95% confidence level in the results, as calculated by the RESNET National Buildings Registry (See below informative note 1). Ratings shall be selected for QA File review according to 904.3.2.7.1 AND the QAD shall ensure that a QA File review is completed on a minimum of one (1) rating on a completed home per year for each HERS Rater.

 

904.3.3.2.1.1 Flexibility to choose which Raters to perform field QA on. Providers will be allowed to perform field QA based on 1% of a Rating company's annual total HERS rating count, selecting the newer or lower performing Raters to receive more field QA reviews and higher performing Raters to receive fewer.


Comment #5

Amendment: Proposed MINHERS Addendum 72, File and Field QA, Draft PDS-01
Page Number: 1
Paragraph / Figure / Table / Note: 904.3.2.3.2
Comment Intent: Objection
Comment Type: General

Comment:

I think the proposed alternative QA calculation is a faulty decision that will water down RESNET as the supposed gold standard of the home energy rating industry.

For some context, I believe the largest rating company estimated doing over 100,000 ratings in calendar year 2022. Rounding to an even 100K, the alternative calculation method would only require them to do an estimated 383 file reviews annually (minimum 1 file review per Rater of Record), down from the 10,000 file reviews they're required to do under the current system. That's a reduction of over 96%.

As the leader in ENERGY STAR New Homes certifications, I urge RESNET to follow their own words: "The current system has worked well for over two decades with no accusations of fraud nor maleficence. RESNET and EPA have worked well together on quality assurance and compliance complaint issues. We urge you to consider the old adage of “if it is not broke don’t fix it”."

( https://www.energystar.gov/sites/default/files/asset/document/9_RESNET_Feedback.pdf )

I'd suggest that, if anything, without structural changes like direct QA oversight from RESNET Staff, this proposed alternative QA calculation would actually lead to more issues of compliance and accusations of fraud.

Furthermore, I'd argue that attempting to add this alternative QA calculation rate based on a QA Provider's annual total, not by rating company or division of a vertically integrated rating company, is bringing RESNET's own words from 2019 to life:

"Allowing a new VOO to provide an inferior quality assurance program could water-down the value of an ENERGY STAR certified home thereby creating a race to the bottom that will result in distrust among the builders that are paying for the certifications as well as the consumers buying the homes."

( https://www.energystar.gov/sites/default/files/asset/document/RESNET_Response%20to%20EPA%20VOO-2_final.pdf )

I guess it's just surprising that the start of a "race to the bottom" is from the organization that self-describes their system as the gold standard.

I propose striking the alternative QA calculation rate completely until systemic, structural issues with the RESNET MINHERS QA system are fixed.


Comment #6

Amendment: Proposed MINHERS Addendum 72, File and Field QA, Draft PDS-01
Page Number: 904.3.3
Paragraph / Figure / Table / Note: HERS Rater Quality Assurance Field review (QA Field review)
Comment Intent: Objection
Comment Type: General

Comment:

Hi there,

 

I am a QAD for ARCXIS here in San Antonio, TX. I've only been a QAD since last March, and I completed the majority of our company's field QAs in 2022, during which I noticed some things. I'd like to offer my input on the current QA requirements.

The first is having to do equal QA reviews on finals versus insulations. I see more issues during my QA reviews on insulations than I do on finals. These can be issues stemming from duct boots not being properly sealed; chases not being sealed properly, or at all; doors or windows not being sealed; receptacles not being sealed; gaps between studs not being sealed; floor systems not being sealed; penetrations into the thermal envelope not being sealed; voids in wall cavities; and top and bottom plate gaskets/foam missing. These are issues I've witnessed during my reviews, and in my opinion they are important to catch. Once I get to the final-stage QA, issues with the blower door or duct blaster are often issues that cannot be corrected. What are your thoughts on the current quota of having to do 50% of both, and maybe changing that to a 60-40 split weighted toward insulations?

Thank you for reading and thanks for your time!

David.

Proposed Change:

Change the QA quota from 50% final reviews and 50% insulation reviews to 40% final reviews and 60% insulation reviews.


Comment #7

Amendment: Proposed MINHERS Addendum 72, File and Field QA, Draft PDS-01
Page Number: 6 & 7
Paragraph / Figure / Table / Note: 904.3.3.2.1.1 & 904.3.3.2.2.1
Comment Intent: Objection
Comment Type: General

Comment:

This is a joint Public Comment written by experienced QADs and endorsed by the companies and individuals listed below. We found several issues with the proposed language for the new Field QA methodology proposed by this Addendum. We have elected to propose an alternative option that we feel addresses the concerns we identify while still meeting the intent: to introduce a merit-based approach for those Raters/RFIs who are consistently doing the right thing, allowing QADs to focus their efforts on working with those Raters who are not performing the standards correctly.

Below are our major concerns with the Proposed Addendum:

1. While we agreed with introducing a merit-based approach and incentivizing accordingly, we STRONGLY disagreed with basing that methodology on an aggregate comparison with all other Raters/RFIs within a given Provider. That comparison is very ambiguous and could lead to wildly different applications depending on the size or average experience level of a Providership, or even to different cut-offs between "higher performing" and "lower performing" from one Provider to another.

2. We felt that linking the proposed total number of Field QAs to 1% of a Provider's total number of Ratings introduced a new, punitive burden on smaller companies in Providerships that include very large companies with well-trained workforces: the smaller company's Raters might be seen as "lower performing" even though they still did well on the Checklist, which might incentivize them to leave that Providership for another with many underperforming Raters. The Standards should never incentivize a Rater to leave one Providership for another.

3. We felt the terms "higher performing" and "lower performing" were derogatory in nature, and that it is very demotivating for an employee of a company, who might be new or simply struggling, to be deemed "lower performing".

Below is our solution to the concerns we have identified:

1. In general, we felt that the merit-based approach was correct, but that it should instead be tied directly to the average percentage score that the individual Rater or RFI achieves on the QA Checklist, as this is what we as an industry have moved to as our QA process, in which case they are only competing against themselves and are incentivized with less QA if they consistently do a good job on the tool we have shifted to for judging their performance.

2. In place of the language in the addendum, we are proposing a completely new Field QA methodology based on the idea of a Rater/RFI Tier, which a Rater/RFI would be a part of for a calendar year based on the previous year's performance. We broke Raters/RFIs out into four Tiers instead of only two "higher performing" and "lower performing" categories. We felt that grouping Raters/RFIs into only two groups was not adequate, and that there should be additional groupings to allow for greater incentivization of good performance, while also limiting the total number of people able to achieve the maximum benefit.

3. We included a pathway for average performing Raters/RFIs, or Tier III, to simply follow the existing 1% methodology.

4. We added a Tier for Raters/RFIs out of compliance, or Tier IV, and outlined how they are subject to corrective action under 905.6 Corrective Action For QA Providers, Raters, And RFIs and Section 102.2.9.3 - Written Certified HERS Rater, RFI and HERS Modeler Disciplinary Procedures, as well as a pathway for Tier IV Raters/RFIs to become Tier III again.

  • By including reference to 905.6, this will ensure that if three consecutive File QAs are failed, even for a Tier I Rater, the Rater will automatically be dropped to Tier IV for Field QA until the Corrective Action / Disciplinary Procedures have been met, and then will only be raised to Tier III until they can demonstrate repeated compliance again.

5. We incorporated and built on the idea of backstops: minimum numbers of Field QAs, as had been introduced before, but also tiered maximum amounts of Field QA for the higher Tiers I and II to incentivize good performance (which in practice could mean less than 1% total for a Providership is completed). We also added additional Field QA when a Rater or RFI is introduced to a type of Rating they have never performed before, and defaulted Raters/RFIs new to a Providership (whether newly certified, moving from one Providership to another, or joining a second or additional Providership) to Tier III for their first year.

This joint Public Comment has been endorsed by the following companies as of 2/10/2023.

Energy Efficient Homes Midwest, Inc.
Hathmore Technologies/EnergySmart Institute
Bremen Energy Auditors, LLC
Prairie Insulation
Walker Energy Efficient Homes
15 lightyears
TruTech Tools, LTD
Florida Solar Energy Center
R Family Company LLC
Lenz Consultants
TRA Certification
Goley Inc
Bravo! Home Performance, LLC
QUANTUM RATERS LLC
Michigan Home Performance, LLC

Proposed Change:

904.3.3.2.1.1 Flexibility to choose which Raters to perform field QA on. Providers will be allowed to perform field QA based on 1% of their own Provider rating count, selecting the newer or lower performing Raters to receive more field QA reviews and higher performing Raters to receive less.

904.3.3.2.1.1.1 Backstop: All Raters would receive a minimum of one field QA review per year

904.3.3.2.1.1.2 Exception: Raters performing 100 or fewer ratings may be exempt from field QA in a given year if they meet all the following criteria:

a. They fall into the “high performing” category. Providers must evaluate the QA Review Checklists for the previous 12 months to determine the individual’s average score. The top performing 50% of Raters in the Providership are allowed to have lower than 1% field QA rates. Raters who fall into the category of “field only” for their QA reviews are evaluated with the RFIs; AND

b. They received at least one field QA review the previous calendar year; AND

c. During the year of the exemption, the Rater shall receive 100% or 15 full file QA reviews, whichever is less.

904.3.3.2.1.1 Choose how much Field QA to perform based on the previous calendar year’s average percentage score for a Rater using the QA Review Checklist to determine the Rater Tier.  Providers will be allowed to perform Field QA based on the following criteria:

904.3.3.2.1.1.1 Tier I Raters:  Those Raters whose average percentage score using the QA Review Checklist over the previous calendar year is from 0% to 8% will be considered a Tier I Rater for the next calendar year.

904.3.3.2.1.1.1.1 Tier I Rater Field QA Requirements:  All Tier I Raters would receive a minimum of one Field QA review per year.  If a Tier I Rater performs more than 400 Ratings in a year, they would receive a minimum of two Field QA reviews per 200 ratings performed in the year, up to five Field QA reviews per year.

904.3.3.2.1.1.1.1.1 Tier I Rater Minimum Exemption:  Tier I Raters performing 100 or fewer ratings may be exempt from field QA in a given year if they meet all the following criteria:

• They received at least one field QA review the previous calendar year; AND
• During the year of the exemption, the Tier I Rater shall receive 100% or 15 full file QA reviews, whichever is less.

904.3.3.2.1.1.1.1.2 Tier I Rater Maximum Backstop: Tier I Raters will not receive more than five Field QA reviews per year unless required by 904.3.3.2.1.1.5 - New Rating Experience Backstop, by the Provider, or requested by the Rater.

904.3.3.2.1.1.2 Tier II Raters:  Those Raters whose average score using the QA Review Checklist over the previous calendar year is greater than 8% but less than or equal to 16% will be considered a Tier II Rater for the next calendar year.

904.3.3.2.1.1.2.1 Tier II Rater Field QA Requirements: All Tier II Raters would receive a minimum of one Field QA review per year.  If a Tier II Rater performs more than 300 Ratings in a year, they would receive a minimum of two Field QA reviews per 150 ratings performed in the year, up to eight Field QA reviews per year.

904.3.3.2.1.1.2.1.1 Tier II Rater Minimum Exemption:  Tier II Raters performing 100 or fewer ratings may be exempt from field QA in a given year if they meet all the following criteria:

• They received at least one field QA review the previous calendar year; AND
• During the year of the exemption, the Tier II Rater shall receive 100% or 25 full file QA reviews, whichever is less.

904.3.3.2.1.1.2.1.2 Tier II Rater Maximum Backstop: Tier II Raters will not receive more than eight Field QA reviews per year unless required by 904.3.3.2.1.1.5 - New Rating Experience Backstop, by the Provider, or requested by the Rater.

904.3.3.2.1.1.3 Tier III Raters: Those Raters whose average score using the QA Review Checklist over the previous calendar year is greater than 16% but less than or equal to 25% will be considered a Tier III Rater for the next calendar year.

904.3.3.2.1.1.3.1 Tier III Rater Field QA Requirements:  Providers will perform Field QA in accordance with 904.3.3.2.1 HERS Raters.

904.3.3.2.1.1.4 Tier IV Raters: Raters that produce enough ratings receiving a failing score on the RESNET QA Review Checklist to trigger corrective action under section 905.6 Corrective Action For QA Providers, Raters, And RFIs are considered Tier IV Raters. These Raters must undergo disciplinary procedures, including an increased level of field QA, as described in Section 102.2.9.3 - Written Certified HERS Rater, RFI and HERS Modeler Disciplinary Procedures. Once the terms of these Disciplinary Procedures have been met, the Provider can move the Rater back to Tier III.

904.3.3.2.1.1.5 New Rating Experience Backstop: If a Rater participates in a Rating that includes an experience type they have not worked with in the past within a Providership, a Field QA review will be required in that calendar year for each additional new aspect to ensure the Rater understands how to correctly perform and/or model the new rating experience. This may count as one required field QA in the total number required based on that Rater's Tier, or will be an additional field QA if the total number required has already been met for the year. The following shall be considered a new experience type:

• New Construction Type (e.g. split-level, multi-family, etc.)
• A rating performed under a RESNET EEP, including but not limited to the following:
  • ENERGY STAR
  • IAP
  • DOE ZERH
• ANSI/RESNET/ACCA Standard 310 Inspections
• Other items at the Provider's discretion

904.3.3.2.1.1.6 Determining Rater Tiers: Providers will determine each Rater's Tier based on the previous calendar year's average percentage score, using the QA Review Checklist, for each registered rating the individual worked on as Rater of Record. The tier for each individual for the new calendar year shall be recorded in the RESNET Registry before submission of the Provider's Annual QA Report for the previous calendar year.

904.3.3.2.1.1.7 Initial Rater Tier: Each Rater newly added to a Provider, whether newly certified, recently moved from one Provider to another, or beginning work with a second or additional Provider, will be considered Tier III for their initial calendar year with the new Provider.

904.3.3.2.2.1 Flexibility to choose which RFIs to perform field QA on. Providers will be allowed to perform field QA based on 1% of their own Provider rating count, selecting the newer or lower performing RFIs to receive more field QA reviews and higher performing RFIs to receive less.

904.3.3.2.2.1.1 Backstop: All RFIs would receive a minimum of one field QA review per year

904.3.3.1.2.1.2 Exception: RFIs performing 100 or fewer final inspections OR pre-drywall inspections (whichever is higher) may be exempt from field QA in a given year if they meet the following criteria:

a. They fall into the “high performing” category. Providers must evaluate the QA Review Checklists for the previous 12 months to determine the individual’s average score. The top performing 50% of RFIs in the Providership are allowed to have lower than 1% field QA rates.

b. They received at least one field QA review the previous calendar year.

c. During the year of the exemption, the Rater(s) of Record associated with that year’s total field work of the eligible RFI shall receive 100% or 15 full file QA reviews, whichever is less.

904.3.3.2.2.1 Choose how much Field QA to perform based on the previous calendar year’s average percentage score for a RFI using the QA Review Checklist to determine the RFI Tier.  Providers will be allowed to perform Field QA based on the following criteria:

904.3.3.2.2.1.1 Tier I RFIs:  Those RFIs whose average percentage score using the QA Review Checklist over the previous calendar year is from 0% to 8% will be considered a Tier I RFI for the next calendar year.

904.3.3.2.2.1.1.1 Tier I RFI Field QA Requirements:  All Tier I RFIs would receive a minimum of one Field QA review per year.  If a Tier I RFI performs more than 400 Ratings in a year, they would receive a minimum of two Field QA reviews per 200 ratings performed in the year, up to five Field QA reviews per year.

904.3.3.2.2.1.1.1.1 Tier I RFI Minimum Exemption:  Tier I RFIs performing 100 or fewer ratings may be exempt from field QA in a given year if they meet all the following criteria:

• They received at least one field QA review the previous calendar year; AND
• During the year of the exemption, the Tier I RFI's Rater shall receive 100% or 15 full file QA reviews, whichever is less.

904.3.3.2.2.1.1.1.2 Tier I RFI Maximum Backstop: Tier I RFIs will not receive more than five Field QA reviews per year unless required by 904.3.3.2.2.1.5 New RFI Experience Backstop, by the Provider, or requested by the RFI or the RFI’s Rater.

904.3.3.2.2.1.2 Tier II RFI:  Those RFIs whose average score using the QA Review Checklist over the previous calendar year is greater than 8% but less than or equal to 16% will be considered a Tier II RFI for the next calendar year.

904.3.3.2.2.1.2.1 Tier II RFI Field QA Requirements: All Tier II RFIs would receive a minimum of one Field QA review per year.  If a Tier II RFI performs more than 300 Ratings in a year, they would receive a minimum of two Field QA reviews per 150 ratings performed in the year, up to eight Field QA reviews per year.

904.3.3.2.2.1.2.1.1 Tier II RFI Minimum Exemption:  Tier II RFIs performing 100 or fewer ratings may be exempt from field QA in a given year if they meet all the following criteria:

• They received at least one field QA review the previous calendar year; AND
• During the year of the exemption, the Tier II RFI's Rater shall receive 100% or 25 full file QA reviews, whichever is less.

904.3.3.2.2.1.2.1.2 Tier II RFI Maximum Backstop: Tier II RFIs will not receive more than eight Field QA reviews per year unless required by 904.3.3.2.2.1.5 New RFI Experience Backstop, by the Provider, or requested by the RFI or the RFI’s Rater.

904.3.3.2.2.1.3 Tier III RFI: Those RFIs whose average score using the QA Review Checklist over the previous calendar year is greater than 16% but less than or equal to 25% will be considered a Tier III RFI for the next calendar year.

904.3.3.2.2.1.3.1 Tier III RFI Field QA Requirements:  Providers will perform Field QA in accordance with 904.3.3.2.2 Rating Field Inspectors.

904.3.3.2.2.1.4 Tier IV RFI: RFIs that work on enough ratings receiving a failing score on the RESNET QA Review Checklist to trigger corrective action under section 905.6 Corrective Action For QA Providers, Raters, And RFIs are considered Tier IV RFIs. These RFIs must undergo disciplinary procedures, including an increased level of field QA, as described in Section 102.2.9.3 - Written Certified HERS Rater, RFI and HERS Modeler Disciplinary Procedures. Once the terms of these Disciplinary Procedures have been met, the Provider can move the RFI back to Tier III.

904.3.3.2.2.1.5 New RFI Experience Backstop: If an RFI participates in a Rating that includes an experience type they have not worked with in the past within a Providership, a Field QA review will be required in that calendar year for each additional new aspect to ensure the RFI understands how to correctly perform the new rating experience. This may count as one required field QA in the total number required based on that RFI's Tier, or will be an additional field QA if the total number required has already been met for the year. The following shall be considered a new experience type:

• New Construction Type (e.g. split-level, multi-family, etc.)
• A rating performed under a RESNET EEP, including but not limited to the following:
  • ENERGY STAR
  • IAP
  • DOE ZERH
• ANSI/RESNET/ACCA Standard 310 Inspections
• Other items at the Provider's discretion

904.3.3.2.2.1.6 Determining RFI Tiers: Providers will determine each RFI's Tier based on the previous calendar year's average percentage score, using the QA Review Checklist, for each registered rating the individual worked on. The tier for each individual for the new calendar year shall be recorded in the RESNET Registry before submission of the Provider's Annual QA Report for the previous calendar year.

904.3.3.2.2.1.7 Initial RFI Tier: Each RFI newly added to a Provider, whether newly certified, recently moved from one Provider to another, or beginning work with a second or additional Provider, will be considered Tier III for their initial calendar year with the new Provider.
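As an illustrative sketch only (not proposed standard language), the score-based Tier I-III thresholds proposed above, which map the previous calendar year's average QA Review Checklist percentage score to a Tier, could be expressed as follows. Tier IV is not score-based (it is triggered by corrective action under 905.6), so it is deliberately not assigned here:

```python
def checklist_tier(avg_score_pct: float) -> str:
    """Map the previous calendar year's average QA Review Checklist
    percentage score to a proposed Tier (lower score = better).
    Tier IV is triggered separately by 905.6 corrective action and is
    therefore not covered by this score mapping."""
    if not 0 <= avg_score_pct <= 25:
        raise ValueError("score outside the 0-25% range covered by Tiers I-III")
    if avg_score_pct <= 8:
        return "Tier I"
    if avg_score_pct <= 16:
        return "Tier II"
    return "Tier III"

print(checklist_tier(7.5))   # Tier I
print(checklist_tier(16.0))  # Tier II (boundary: "less than or equal to 16%")
print(checklist_tier(20.0))  # Tier III
```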


Comment #8

Amendment: Proposed MINHERS Addendum 72, File and Field QA, Draft PDS-01
Page Number: 21
Comment Intent: Objection
Comment Type: General

Comment:

dasdasddas

Proposed Change:

asdsa


Comment #9

Amendment: Proposed MINHERS Addendum 72, File and Field QA, Draft PDS-01
Page Number: 1
Paragraph / Figure / Table / Note: 904.3.2.3.2
Comment Intent: Objection
Comment Type: General

Comment:

Rather than using a ranking system to determine higher performing vs. lower performing Raters & RFIs, we recommend that a defined average error rate threshold be the determining factor. All Raters & RFIs should have an equal chance of achieving higher performing status, regardless of the other Raters or RFIs in the Providership.


We also see the need to put minimum requirements in place for newly certified individuals and those who do not meet the high performance threshold. The draft clearly states that higher performing individuals still require a minimum of one Field and one File QA review per year; a minimum Field and File QA requirement for lower performing and newly certified inspectors should be stated just as clearly. We understand that Providers with only a few inspectors able to meet the threshold for higher performance may need to commit to additional Field QA in order to help their inspectors reach the higher performance goal.

We recommend the average error rate as the performance metric. The error rate would be calculated for each QA review using the formula below and then averaged across all QA reviews from the previous 12-month period:

Error Rate % = (Score * 100) / Total Overall Rating Points

Example: Checklist Score of 30, Total Overall Rating Points of 837, Error Rate Percentage of 3.58%

(30*100) / 837 = 3.58
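For illustration, the formula and its 12-month averaging can be sketched as follows (function names are illustrative, not part of the proposal):

```python
def error_rate(score, total_overall_rating_points):
    """Error rate for a single QA review, as a percentage."""
    return (score * 100) / total_overall_rating_points

def average_error_rate(reviews):
    """Average error rate across all (score, total points) QA reviews
    from the previous 12-month period."""
    return sum(error_rate(s, t) for s, t in reviews) / len(reviews)

# Example from the comment: checklist score of 30 against 837 total points.
print(round(error_rate(30, 837), 2))  # 3.58
```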


Inspectors with a significantly high average error rate (>20%) should also be required to have increased Field & File QA reviews.

Once all the defined minimums have been fulfilled, any remaining Field & File QA requirements, based on the provider’s total rating count, will be allocated per QAD discretion.

Proposed Change:

Rater Field QA

904.3.3.2.1.1 Flexibility to choose which Raters to perform field QA on. Providers will be allowed to perform field QA based on 1% of their own Provider rating count, selecting the newer or lower performing Raters to receive more field QA reviews and higher performing Raters to receive less.

904.3.3.2.1.1.1 Backstop: All Raters would receive a minimum of one field QA review per year

904.3.3.2.1.1.1.1 Any Rater who meets or does better than the high performing average error rate threshold of 10% on Field QA reviews from the past 12-month period shall require a minimum of one Field QA review in that year.

904.3.3.2.1.1.1.2 Any Rater who does not meet the high performing average error rate threshold of 10% on Field QA reviews from the past 12-month period shall require a minimum of two Field QA reviews in that year.

904.3.3.2.1.1.1.3 Any Rater who was newly certified in the previous 12-month period shall require a minimum of two Field QA reviews in that year.

904.3.3.2.1.1.1.3.1 If the newly certified Rater was previously working as an RFI, the RFI's average error rate from the past 12-month period can be used to determine the number of Field QA reviews required in that year.

904.3.3.2.1.1.1.4 Any Rater whose average error rate on Field QA reviews from the past 12-month period exceeds 20% shall require a minimum of three Field QA reviews, or 100% Field QA reviews whichever is less, in that year.

904.3.3.2.1.1.1.5 All remaining Field QA needs, based on the Provider's total number of ratings, will be allocated using QAD discretion.

904.3.3.2.1.1.2 Exception: Raters performing 100 or fewer ratings may be exempt from field QA in a given year if they meet all the following criteria:

a. They fall into the “high performing” category. Providers must evaluate the QA Review Checklists for the previous 12 months to determine the individual’s average error rate. Raters who meet or perform better than the high performing average error rate threshold of 10% on Field QA reviews from the past 12-month period are allowed to have lower than 1% field QA rates. Raters who fall into the category of “field only” for their QA reviews are evaluated with the RFIs; AND

b. They received at least one field QA review the previous calendar year; AND

c. During the year of the exemption, the Rater shall receive 100% or 15 full file QA reviews, whichever is less.

 

RFI & Rater Field Only Field QA

904.3.3.2.2.1 Flexibility to choose which RFIs to perform field QA on. Providers will be allowed to perform field QA based on 1% of their own Provider rating count, selecting the newer or lower performing RFIs to receive more field QA reviews and higher performing RFIs to receive less.

904.3.3.2.2.1.1 Backstop: All RFIs would receive a minimum of one field QA review per year

904.3.3.2.2.1.1.1 Any RFI or Rater Field Only who meets or performs better than the high performing average error rate threshold of 10% on Field QA reviews from the past 12-month period shall require a minimum of one Field QA review in that year.

904.3.3.2.2.1.1.2 Any RFI or Rater Field Only who does not meet the high performing average error rate threshold of 10% on Field QA reviews from the past 12-month period shall require a minimum of two Field QA reviews in that year.

904.3.3.2.2.1.1.3 Any RFI or Rater Field Only who was newly certified in the previous 12-month period shall require a minimum of two Field QA reviews in that year.

904.3.3.2.2.1.1.3.1 If the newly certified Rater Field Only was previously working as an RFI, the RFI's average error rate from the past 12-month period can be used to determine the number of Field QA reviews required in that year.

904.3.3.2.2.1.1.4 Any RFI or Rater Field Only whose average error rate on Field QA reviews from the past 12-month period exceeds 20% shall require a minimum of three Field QA reviews, or 100% Field QA reviews whichever is less, in that year.

904.3.3.2.2.1.1.5 All remaining Field QA needs, based on the Provider's total number of ratings, will be allocated using QAD discretion.

904.3.3.2.2.1.2 Exception: RFIs performing 100 or fewer final inspections OR pre-drywall inspections (whichever is higher) may be exempt from field QA in a given year if they meet all the following criteria:

a. They fall into the “high performing” category. Providers must evaluate the QA Review Checklists for the previous 12 months to determine the individual’s average error rate. RFIs who meet or perform better than the high performing average error rate threshold of 10% on Field QA reviews from the past 12-month period are allowed to have lower than 1% field QA rates.

b. They received at least one field QA review the previous calendar year.

c. During the year of the exemption, the Rater(s) of Record associated with that year’s total field work of the eligible RFI shall receive 100% or 15 full file QA reviews, whichever is less.

 

Rater File QA

904.3.2.3.2 Based on the Provider’s combined rating count for all HERS Raters (total population), the Provider's Quality Assurance Designee shall be responsible for an annual QA File review of a statistically significant number of ratings sufficient to achieve a 95% confidence level in the results, as calculated by the RESNET National Buildings Registry  (See below informative note 1). Ratings shall be selected for QA File review according to 904.3.2.7.1 AND the QAD shall ensure that a QA File review is completed on a minimum of one (1) rating on a completed home per year for each HERS Rater.

904.3.2.3.2.1 Any Rater who meets or performs better than the high performing average error rate threshold of 10% on File QA reviews from the past 12-month period shall require a minimum of 1 File QA review in that year.

904.3.2.3.2.2 Any Rater who does not meet the high performing average error rate threshold of 10% on File QA reviews from the past 12-month period shall require a minimum of 5 File QA reviews, or 100% File QA reviews whichever is less, in that year.

904.3.2.3.2.3 Any Rater who was newly certified in the previous 12-month period shall require a minimum of 5 File QA reviews, or 100% File QA reviews whichever is less, in that year.

904.3.2.3.2.4 Any Rater whose average error rate on File QA reviews exceeds 20% shall require a minimum of 10 File QA reviews, or 100% File QA reviews whichever is less, in that year.

904.3.2.3.2.5 All remaining File QA needs, based on the Provider's total number of ratings, will be allocated using QAD discretion.


Comment #10

Amendment: Proposed MINHERS Addendum 72, File and Field QA, Draft PDS-01
Page Number: All
Paragraph / Figure / Table / Note: 904.3
Comment Intent: Objection
Comment Type: General

Comment:

The following comments are submitted on behalf of the U.S. Environmental Protection Agency (EPA) with respect to EPA’s ENERGY STAR program. Please note that the section of MINHERS in question is a lynchpin of RESNET’s application for recognition as an EPA-approved Home Certification Organization, and any change would ultimately need to be submitted and approved by EPA before being used for ENERGY STAR certification purposes. As detailed in the ENERGY STAR Certification System, EPA is happy to consider alternative quality assurance schemes so long as they ultimately deliver an equivalent level of certainty in the quality of ENERGY STAR certifications. The comments below are offered in the spirit of transparency to provide insight on how EPA would evaluate this proposal with respect to the requirements of the ENERGY STAR Certification System, should it ultimately be submitted for requested use with ENERGY STAR certifications.

From EPA’s perspective, it is relatively straightforward to justify the equivalency of an alternative scheme that holds the average rate of quality control review constant but redistributes the incidence of that review in a more flexible manner, as this draft proposes to do with field review. On a conceptual level, EPA is supportive of such an approach, assuming that the distribution methodology is equitable and defensible, and appropriate safeguards and minimum backstops are in place.

It is a more complicated task to justify equivalency of a scheme that reduces the average review rate below the ENERGY STAR Certification System’s default of 1% field and 10% file review. When EPA has approved alternative schemes in the past, it has expected alternate measures to offset any reductions in the review rates, such as centralized collection of the ENERGY STAR checklist and larger quantities of photos collected per home. As currently drafted, this proposal would result in a lower file review rate without identifying any new, offsetting elements.

In addition, by offering Providers a choice between the historic flat 10% rate and the new formula (presumably, whichever is most beneficial), this proposal would result in a selective lowering of the bar for a subset of participants. It is hard to see how such a scheme could meet the ENERGY STAR Certification System’s intent of fairness and consistency.

EPA offers the following detailed comments.

If the standards endorse a new file review rate calculation, it should be applied consistently to all participants/ratings.

Taken by itself, the 95% confidence calculation actually requires more file reviews for most organizations than the historical 10% rate. The break-even point is approximately 4000 ratings per year. Using RESNET’s calculator, here are some examples of the effect of organization size on the file review rate under the new formula:

Example Review Rates Under New Formula

"Population" Size      File Reviews       File Review Rate
100 ratings/year       80 file reviews    80%
500 ratings/year       217 file reviews   43%
1,000 ratings/year     278 file reviews   28%
5,000 ratings/year     357 file reviews   7%
10,000 ratings/year    370 file reviews   4%
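For reference, these figures are consistent with the standard Cochran sample-size formula with finite population correction at 95% confidence, a ±5% margin of error, and p = 0.5. This is an assumption for illustration; the Registry's exact formula has not been published:

```python
def file_reviews_needed(population, z=1.96, margin=0.05, p=0.5):
    """Sample size for a proportion at 95% confidence with +/-5% margin,
    using Cochran's formula with finite population correction.
    (Assumed formula -- the RESNET Registry's exact method is unpublished.)"""
    n0 = (z ** 2 * p * (1 - p)) / margin ** 2       # ~384.16 for an infinite population
    return round(n0 / (1 + (n0 - 1) / population))  # finite population correction

for size in (100, 500, 1000, 5000, 10000):
    print(size, file_reviews_needed(size))
```

Under this formula the required review count plateaus near 384 as the population grows, which is why the review rate falls so sharply for large organizations.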

Applying this new formula across RESNET’s membership would likely result in an average file review rate of at least 10%, but in a different distribution across companies. However, the proposal is not to adopt the 95% confidence formula across the board, but to make it one option alongside the flat 10% rate. By providing an either/or choice, the effect will not be to redistribute file reviews, but to selectively lower the file review rates for larger Providers. EPA does not take a stance on whether RESNET should maintain the current flat review rates or introduce a new calculation. However, absent a justification for a selective policy, the principles of fairness and consistency dictate the need to pick one formula or the other and apply it uniformly to all participants and ratings.

Ratings produced by different rating companies cannot be assumed to be similar and should be considered separate “populations”.

A statistical “population” is a set of similar items of interest, which, in this case, are energy ratings. In the draft proposal section 904.3.2.3.2, the populations are effectively defined as the combined annual energy rating count for all HERS Raters operating under a Provider. However, EPA believes it is invalid to assume similarity between ratings occurring under different rating companies. There can be significant variations between rating companies’ internal culture, policies, practices, equipment, and so on, even when operating under the same Providership.

There are two reasons this is important. First, defining the populations at the Provider level will result in a lower quantity of review to reach a certain nominal confidence level. If the population sets are in fact composed of dissimilar items, the result will be too little review and, therefore, a too-low confidence level achieved in practice. Second, defining the population across all rating companies without minimum backstops for each company risks “under-sampling” certain rating companies, and potentially missing company-specific QA issues. In summary, the current proposal creates the potential for too few file reviews, and for those reviews to be ineffectively distributed.

“Populations” of ratings should be further separated by division, for larger organizations.

When large companies (which, at the scale of 4000+ annual ratings, are typically combined Rater/Provider companies) serve multiple markets, EPA has historically observed variances in internal culture, policies, practices, tools, and equipment between markets. These variances may be influenced by corporate structure, division managers, QADs, acquisition history, regional builder practices, and other factors. Therefore, it should not be assumed that ratings produced by separate regional divisions of large organizations are similar.

For that reason, the “population” should be limited to ratings produced by a particular division. EPA acknowledges that “division” is not a defined term nor a collected datapoint and may be difficult to define in a way that captures the intent without being easily circumvented. In lieu of a better option, EPA suggests using the ratings’ state as a proxy for rating company division.

EPA supports flexibility in distributing field review, with caveats.

While EPA supports a more flexible approach to targeting the Field QA in concept, it recognizes other commenters’ feedback on potential drawbacks of the particular methodology proposed. While EPA does not take a position on these comments at this time, it does believe that, at a minimum, the comments warrant in-depth consideration and response.

Finally, EPA notes that it had already planned to update the ENERGY STAR Certification System this year to establish a minimum backstop field QA rate that would be required under all ENERGY STAR certification program QA schemes. EPA intended to propose a backstop of 1 field QA per rater, per year as a starting point for discussion but will be soliciting feedback on this topic and appreciates RESNET’s input.

Proposed Change:

If the standards endorse a new QA rate calculation, it should be applied consistently to all participants/certifications.

EPA does not take a stance on whether MINHERS should maintain the current flat QA rates or introduce a new calculation. However, for the sake of fairness and consistency, EPA believes it will be necessary to pick one or the other. The specific edits required will depend on which approach RESNET wishes to take.

Ratings produced by different rating companies cannot be assumed to be similar and should be considered separate “populations”; “Populations” of ratings should be further separated by division, for larger rating companies.

EPA suggests the following edit as one way to address these two comments:

904.3.2.3.2 Based on a total population defined as the annual rating count produced by a rating company in a state the Provider’s combined rating count for all HERS Raters (total population), the Provider's Quality Assurance Designee shall be responsible for an annual QA File review of a statistically significant number of ratings sufficient to achieve a 95% confidence level in the results, as calculated by the RESNET National Buildings Registry  (See below informative note

Note that “rating company” is not a defined term in either MINHERS or ANSI-301, though it is a collected datapoint in the RESNET Registry schema. EPA suggests formally defining this term in the standards and would leave the task to the subcommittee.

EPA supports flexibility in distributing field review, with caveats.

EPA suggests the subcommittee add a minimum backstop per rater, per time period (to be determined). See prior comments for additional context.

On an editorial note, EPA suggests restructuring this section to move the definition of the "top 50%" rule (or whatever final rules result after addressing the methodology feedback from other commenters) earlier in the section, as opposed to burying it in the exception/exemption. For example purposes, here is the suggested edit using the current draft:


904.3.3.2.1.1 Flexibility to choose which Raters to perform field QA on. Providers will be allowed to perform field QA based on 1% of their own Provider rating count, selecting the newer or lower performing Raters to receive more field QA reviews and higher performing Raters to receive less. Providers must evaluate the QA Review Checklists for the previous 12 months to determine the individual’s average score. The top performing 50% of Raters in the Providership are allowed to have lower than 1% field QA rates. Raters who fall into the category of “field only” for their QA reviews are evaluated with the RFIs.

904.3.3.2.1.1.1 Backstop: All Raters would receive a minimum of one field QA review per year

904.3.3.2.1.1.2 Exception: Raters performing 100 or fewer ratings may be exempt from field QA in a given year if they meet all the following criteria:

a. They fall into the “high performing” category. Providers must evaluate the QA Review Checklists for the previous 12 months to determine the individual’s average score. The top performing 50% of Raters in the Providership are allowed to have lower than 1% field QA rates. Raters who fall into the category of “field only” for their QA reviews are evaluated with the RFIs; AND

ba. They received at least one field QA review the previous calendar year; AND

cb. During the year of the exemption, the Rater shall receive 100% or 15 full file QA reviews, whichever is less.


Comment #11

Amendment: Proposed MINHERS Addendum 72, File and Field QA, Draft PDS-01
Page Number: multiple
Comment Intent: Objection
Comment Type: General

Comment:

Based on the submitted public comments and informal feedback we have received, it is clear that there is a desire for increased rigor as a trade-off for the potential decrease in total file QA reviews as proposed. In short, the sentiment is that if fewer reviews are being done, there should be something extra required of participants that results in higher consistency and quality. This proposed change aims to ensure that the overall program does not suffer any loss of quality, even if some Providers are able to comply with fewer total file QA reviews.

One obvious step is to ensure Providers and QADs are consistently and accurately completing file QA reviews and collecting all of the necessary supporting documentation. RESNET currently performs audit reviews as part of ongoing "enhanced Quality Assurance oversight" of Rating Providers, but the number of audits per year is limited due to limited staff. In order to increase the number of reviews subject to audit without increasing fees, a peer review program is being proposed here.

The idea of peer reviews is not new, but it was impractical in the past because there was no central repository where all reviews were stored. At the time of the original draft amendment, it was unclear if or when RESNET would have such a central repository with "real-time" tracking of file QA reviews. Since that time, RESNET has been developing improvements that will make this possible by the time this amendment would become mandatory. Given those improvements, RESNET will have access to file QA reviews as they happen, which will make the proposed peer review program possible.

Proposed Change:

904.3.2.3 Providers shall calculate the number of file QA reviews required annually using one of the following methods:


904.3.2.2.23.1 For each HERS Rater, the Provider's Quality Assurance Designee shall be responsible for an annual QA File review of the greater of one (1) rating or ten percent (10%) of the HERS Rater's annual total of Confirmed, Threshold or Sampled Ratings. When determining the number of ratings to review for a HERS Rater, round up to the next whole number when the percentage calculation yields a decimal point, e.g. 101 ratings x 10% = 10.1 means that 11 ratings shall be reviewed. OR;


904.3.2.3.2 Based on the Provider’s combined rating count for all HERS Raters (total population), the Provider's Quality Assurance Designee shall be responsible for an annual QA File review of a statistically significant number of ratings sufficient to achieve a 95% confidence level in the results, as calculated by the RESNET National Buildings Registry  (See below informative note 1). Ratings shall be selected for QA File review according to 904.3.2.7.1 AND the QAD shall ensure that a QA File review is completed on a minimum of one (1) rating on a completed home per year for each HERS Rater.


904.3.2.3.2.1 Provider Qualifications.  In order to qualify for the alternative method described in 904.3.2.3.2, ongoing participation in a peer review program implemented by RESNET is required.


904.3.2.3.2.1.1 RESNET shall implement a peer review program to facilitate independent, duplicate QA File reviews by Quality Assurance Designees in different Provider organizations.  The data shall be collected by RESNET and sanitized of identifying characteristics such as address, builder, rater, and Provider.  Participating Providers must submit relevant supporting documentation to RESNET within five (5) business days upon request.  Participating Providers must complete peer reviews on assigned, sanitized file QA rating data files and return them within ten (10) business days.  The minimum number of peer reviews requested and assigned shall be based on one percent of the Provider’s total rating count for the previous quarter. 


One or more instances of Significant Scoring Differences as compared to peer reviews OR an ongoing pattern of errors shall result in an increased rate of peer reviews and may lead to disqualification from the alternative method described in 904.3.2.3.2.


RESNET shall audit peer reviews with Significant Scoring Differences and shall arbitrate any disagreements in the application of the file QA review checklist.


Significant Scoring Differences for peer reviews shall be defined as >15% difference in the assigned percentage score OR errors that change the final pass/fail result.
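For illustration, this definition can be sketched as follows, reading “>15% difference” as more than 15 percentage points (the function name and inputs are illustrative, not part of the proposal):

```python
def significant_scoring_difference(provider_pct, peer_pct, provider_pass, peer_pass):
    """True when a peer review differs from the Provider's original file QA
    review by more than 15 percentage points in the assigned score, or
    changes the final pass/fail result."""
    return abs(provider_pct - peer_pct) > 15 or provider_pass != peer_pass

print(significant_scoring_difference(92.0, 75.0, True, True))   # True: 17-point gap
print(significant_scoring_difference(92.0, 85.0, True, True))   # False: within 15 points, same result
print(significant_scoring_difference(92.0, 85.0, True, False))  # True: pass/fail flip
```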


Comment #12

Amendment: Proposed MINHERS Addendum 72, File and Field QA, Draft PDS-01
Page Number: N/A
Comment Intent: Objection
Comment Type: General

Comment:

In reading RESNET's latest proposed update, which would add a "peer review" program to the MINHERS standards, I see a serious risk of violating antitrust laws.

Yes, the data RESNET provides to a participating QAD would be sanitized of identifying information, but you're also requiring supporting documentation, which frankly includes things like plans and specifications from the builder. Sanitizing the identifying information from those supporting documents takes time.

In creating this "peer review program", you're creating more work for RESNET Staff after already admitting to not having the staffing to do the work that needs to be done NOW.

On top of that, how do you propose paying the participating Providers, or their QADs, for the additional work they would have to do? Why would they sign up for a "peer review program" that requires additional work for the company without adding to their profit margin?

Beyond that, what happens if a Provider decides to opt out in the middle of the year? How does this peer review program work? Would these reviews be done before registration or after registration? Doesn't this lead to further delays in the certification of homes?

This whole proposal is more holey than a block of swiss cheese.


Comment #13

Amendment: Proposed MINHERS Addendum 72, File and Field QA, Draft PDS-01
Page Number: 2, 6, 7
Paragraph / Figure / Table / Note: 904.3.2.3.2, 904.3.3.2.1.1, 904.3.3.2.2
Comment Intent: Objection
Comment Type: Technical

Comment:

Building Efficiency Resources is opposed to reducing the existing QA process as proposed in MINHERS® Addendum 72, File and Field QA. The primary reason for objecting to the proposed amendment is a lack of substantial basis for a need to change. There may be as much or more information to support increasing the amount of QA as there is to support reducing it. QA should only be reduced if we have a basis for acceptable accuracy and consistency, and it has been determined that we are meeting that criterion. To defend a reduction of QA efforts, there must be tangible evidence supporting it. Simply declaring the process to be burdensome is not enough; we must identify the nature of the burden and address that first. We are unaware of specific details showing that QA across the industry is accurate or consistent to the extent that we may relax the requirements.

A secondary reason is that our industry has successfully implemented a simple approach to QA: 10% of all Rating Files and 1% of all Field inspections. While the tool provided by RESNET is simple in that it only requires entering one number, the complexity comes in the flexibility of having multiple Raters/RFIs on different scales. A simplistic approach can be used to satisfy both the need for accuracy and consistency and reduce the burden associated with maintaining compliance with the Standards. Additionally, the proposed alternate solution appears to provide little to no benefit to small Providers. The point at which reduced QA is allowed based on the tool provided starts at 3465 ratings. Beyond that point, the benefit of the QA essentially grinds to a halt, as the percentage of QA file reviews above that point falls below 10%, which means more typos or human error will go unnoticed.

QA reviews need to be more than just reviewing, identifying discrepancies, correcting the errors, and submitting the file. Our emphasis, and our obligation should be to correct the behaviors that are the root of the errors, mistakes, or issues identified during the QA reviews. Our industry is growing and expanding and needs all existing trained and certified personnel. Correcting bad behavior and motivating them to change, to get better and to excel in their career will prove to be fruitful for Providers and the industry.

As stated, QA should only be reduced if we have a basis for acceptable accuracy and consistency. This can only be determined if we have a “scorecard” that shows where we stand as an industry, and shows either a steady, improving, or declining rating quality. The QA Review Checklist has provided a common location and details on how to “score” each area, but an Excel spreadsheet, especially when dealing with hundreds of thousands of ratings, is not the ultimate way to manage that amount of data and offers a burdensome approach to keeping up with the progress of our industry. We believe there is much that can be done by using computer analysis of the data. This will require a common database structure that will support analyses and comparisons of the data.

Obtaining the needed information starts with having accuracy and consistency among the QADs performing the reviews. We are unaware of that being something currently measurable. The QA Checklist is a great tool that can likely help us get to that point. Well-defined criteria are needed for such scoring/grading, recognizing that some flexibility and judgment will be required. Nonetheless, some information/data is better than none.

For additional information, but not specifically addressing this Addendum, we have also provided a structure within the QA Checklist that accumulates scores and grades for not only the full checklist, but also individual areas to focus on where the issues are occurring across the industry. Those sub-areas are Project, Envelope, Fenestration, Mechanical, Testing and Lighting/Appliances. Each area has a percentage of the correct responses and provides both a score and a letter grade (A-F) which gives each QAD/Rater/RFI/Modeler something they already understand and can easily grasp where they stand with respect to expectations.

A complex structure/process is not needed. An efficient, simple process for data evaluation, management, and processing is one that will reduce the burden and provide a path to possible reductions in the future. We are opposed to changing the current process until sufficient data has been collected to demonstrate the basis for QA reductions and to provide insight into any specific areas where QA may still be required at the current levels. We are also opposed to the alternative statistical approach using the available RESNET tool and the “flexibility” methods, since we believe they will drive inconsistency among Providers.

However, to support requests for improvements to the process, we have provided an alternative approach based on Rater/RFI performance that would give Providers an option to reduce the requirements.  Below are our proposed changes to Addendum 72.

 

Proposed Change:

904.3.2.3.2 Based on the Provider’s combined rating count for all HERS Raters (total population), Rater's score from all QA reviewed ratings from the previous year the Provider's Quality Assurance Designee shall be responsible for an annual QA File review based on the average score of the rater from the previous year on QA File review from the QA Checklist. If the Rater average score is equal to or greater than 90%, and has had no scores below 80%, then the number of file reviews may be reduced to 8% of the Rater's total. If the average score continues equal to or greater than 90% and has had no scores below 80% through the year, the number of reviews may be reduced to 5% for File reviews for the subsequent Year. If at any time during Quarterly reviews the average score falls below 90%, or has had some scores below 80%, the Rater will revert to the 10% rate. Once two consecutive Quarters of equal to or greater than 90% average score with no scores below 80% is demonstrated by the Rater, the reduced numbers from above may be continued. All Raters shall have a minimum of one (1) Field review each year. of a statistically significant number of ratings sufficient to achieve a 95% confidence level in the results, as calculated by the RESNET National Buildings Registry (See below informative note 1). Ratings shall be selected for QA File review according to 904.3.2.7.1 AND the QAD shall ensure that a QA File review is completed on a minimum of one (1) rating on a completed home per year for each HERS Rater.

904.3.2.3.3 To be eligible for this alternative, Raters must have been certified for a minimum of three (3) years and have completed a minimum of 200 homes (resulting in a minimum of 2 field reviews and 20 file reviews).

904.3.2.3.4 If the basis for the average score is limited to a specific type of rating (Single Family, small production homes, Multi Family, etc.) or Program (HERS, ENERGY STAR, EPA Indoor airPLUS, DOE Zero Energy Ready Homes, etc.), then ratings outside of the Rater's experience would require QA at 10% for Field/File reviews until the Rater demonstrates a score equal to or greater than 90% on a minimum of 25 homes (requiring 1 field and 3 file reviews). At that point the Rater may be placed on reduced QA as stated above.

 


 

Informative Note 1 - The RESNET National Building Registry will calculate alternative file QA requirements as shown by the example calculator found on the RESNET website at: https://www.resnet.us/about/standards/minhers/tools. The sample size is calculated for a 95% confidence level using a standard population proportion of 0.5 (50%).
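For context, the sample size such a calculator would produce can be sketched with Cochran's formula plus a finite population correction. This is illustrative only: the 5% margin of error below is an assumption (the note fixes only the 95% confidence level and the 0.5 proportion), and the Registry's actual calculator governs.

```python
import math

def required_file_qa_sample(population, z=1.96, p=0.5, e=0.05):
    """Cochran sample size with finite population correction.

    z: z-score for 95% confidence; p: population proportion (0.5 per
    the informative note); e: margin of error -- an assumed value,
    not taken from the standard."""
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)   # infinite-population sample size
    n = n0 / (1 + (n0 - 1) / population)     # finite population correction
    return math.ceil(n)

print(required_file_qa_sample(1000))  # a 1,000-rating population needs 278 reviews
```

Note how the required review count grows far more slowly than the population, which is the incentive the alternative method offers to larger Providers.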

 

904.3.3.2.1.1 Based on the average score of the Rater from the previous year on QA Field reviews from the QA Checklist: if the Rater's average score is equal to or greater than 90%, with no scores below 80%, then the number of field reviews may be reduced to 0.8% of the Rater's total. If the average score remains equal to or greater than 90%, with no scores below 80%, through the year, the number of Field reviews may be reduced to 0.5% for the subsequent year. If at any time during Quarterly reviews the average score falls below 90%, or any score falls below 80%, the Rater shall revert to the 1% rate. Once the Rater demonstrates two consecutive Quarters with an average score equal to or greater than 90% and no scores below 80%, the reduced rates above may resume.

904.3.3.2.1.2 Backstop: All Raters would receive a minimum of one field QA review per year. For low-volume Raters (fewer than 101 ratings), the first Quarter of the subsequent year may be used to obtain the required Field QA.
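The tiered rates in 904.3.3.2.1.1 can be summarized in a small decision sketch. The function name and inputs are illustrative assumptions; the section text governs:

```python
def field_qa_rate(avg_score, min_score, earned_first_reduction):
    """Field QA rate per the proposed tiers (a sketch, not normative).

    avg_score / min_score: the Rater's average and lowest QA Checklist
    scores (percent) for the period; earned_first_reduction: True if
    the Rater already sustained the 0.8% tier through a full year."""
    if avg_score >= 90 and min_score >= 80:
        # 0.5% after a sustained qualifying year, otherwise 0.8%
        return 0.005 if earned_first_reduction else 0.008
    return 0.01  # revert to the baseline 1% rate
```

For example, a Rater averaging 92% with no score under 80% would drop from 1% to 0.8% field QA, then to 0.5% the following year if the scores hold.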
 

Flexibility to choose which Raters receive field QA: Providers will be allowed to perform field QA based on 1% of their own Provider rating count, selecting newer or lower-performing Raters to receive more field QA reviews and higher-performing Raters to receive fewer.

904.3.3.2.1.1.1 Backstop: All Raters would receive a minimum of one field QA review per year.

904.3.3.2.1.1.2 Exception: Raters performing 100 or fewer ratings may be exempt from field QA in a given year if they meet all the following criteria:
a. They fall into the “high performing” category. Providers must evaluate the QA Review Checklists for the previous 12 months to determine the individual’s average score. The top performing 50% of Raters in the Providership are allowed to have lower than 1% field QA rates. Raters who fall into the category of “field only” for their QA reviews are evaluated with the RFIs; AND

b. They received at least one field QA review the previous calendar year; AND

c. During the year of the exemption, the Rater shall receive 100% or 15 full file QA reviews, whichever is less.
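Assuming the three criteria above are evaluated independently, the exemption test and the exemption-year file QA obligation could be sketched as follows (all names are illustrative):

```python
def rater_field_qa_exempt(annual_ratings, top_half_performer, had_field_qa_last_year):
    """All three criteria of the proposed exception must hold."""
    return (annual_ratings <= 100          # 100 or fewer ratings
            and top_half_performer         # criterion a: "high performing"
            and had_field_qa_last_year)    # criterion b: field QA last year

def exemption_year_file_reviews(annual_ratings):
    """Criterion c: 100% of ratings or 15 full file QA reviews, whichever is less."""
    return min(annual_ratings, 15)
```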

 

904.3.3.2.2.1 Flexibility to choose which RFIs receive field QA: Providers will be allowed to perform field QA based on 1% of their own Provider rating count, selecting newer or lower-performing RFIs to receive more field QA reviews and higher-performing RFIs to receive fewer. Flexibility for reduced QA is allowed based on the Rater's score from the QA Checklist AND the score of the field-related portion of the checklist applicable to the RFI. If both the Rater AND the RFI have an average score equal to or greater than 90% for the previous year, with no scores below 80%, then the number of field reviews may be reduced to 0.8%. If the average score remains above 90%, with no scores below 80%, through the year, the number of Field reviews may be reduced to 0.5% for the subsequent year. If at any time during Quarterly reviews the average score for the Rater OR the RFI falls below 90%, or any score falls below 80%, the Rater AND the RFI shall revert to the 1% rate. Once the Rater AND the RFI demonstrate two consecutive Quarters with an average score equal to or greater than 90% and no scores below 80%, the reduced rates above may resume.

904.3.3.2.2.1.1 Backstop: All RFIs would receive a minimum of one field QA review per year.
904.3.3.2.2.1.2 Exception: RFIs performing 100 or fewer final inspections OR pre-drywall inspections (whichever is higher) may be exempt from field QA in a given year if they meet the following criteria:

a. They fall into the “high performing” category. Providers must evaluate the QA Review Checklists for the previous 12 months to determine the individual’s average score. The top performing 50% of RFIs in the Providership are allowed to have lower than 1% field QA rates.

b. They received at least one field QA review the previous calendar year.

c. During the year of the exemption, the Rater(s) of Record associated with that year’s total field work of the eligible RFI shall receive 100% or 15 full file QA reviews, whichever is less.


Comment #14

Amendment: Proposed MINHERS Addendum 72, File and Field QA, Draft PDS-01
Page Number: 904.3.2
Paragraph / Figure / Table / Note: 904.3.2.3
Comment Intent: Objection
Comment Type: General

Comment:

In order to ensure an equivalent rigor of the Provider quality assurance processes, we recommend requiring these six measures of any Provider to qualify for the alternative QA percentage calculation method defined in 904.3.2.3.2.

Proposed Change:

904.3.2.3 Providers shall calculate the number of file QA reviews required annually using one of the following methods:

904.3.2.3.1 For each HERS Rater, the Provider's Quality Assurance Designee shall be responsible for an annual QA File review of the greater of one (1) rating or ten percent (10%) of the HERS Rater's annual total of Confirmed, Threshold or Sampled Ratings. When determining the number of ratings to review for a HERS Rater, round up to the next whole number when the percentage calculation yields a decimal point, e.g. 101 ratings x 10% = 10.1 means that 11 ratings shall be reviewed; OR
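The greater-of-one-or-10%, round-up rule above reproduces cleanly with integer ceiling division, which avoids floating-point surprises (e.g. 100 * 0.1 is not exactly 10.0 in binary floating point). A sketch:

```python
def file_reviews_required(annual_ratings, rate_pct=10):
    """Greater of one review or rate_pct% of the Rater's annual total,
    rounded up to the next whole number (per the paragraph above)."""
    # -(-a // b) is ceiling division using only integers
    return max(1, -(-annual_ratings * rate_pct // 100))

print(file_reviews_required(101))  # 10.1 rounds up to 11, matching the example
```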

904.3.2.3.2 Based on the Provider’s combined rating count for all HERS Raters (total population), the Provider's Quality Assurance Designee shall be responsible for an annual QA File review of a statistically significant number of ratings sufficient to achieve a 95% confidence level in the results, as calculated by the RESNET National Buildings Registry  (See below informative note 1). Ratings shall be selected for QA File review according to 904.3.2.7.1 AND the QAD shall ensure that a QA File review is completed on a minimum of one (1) rating on a completed home per year for each HERS Rater.

904.3.2.3.2.1 Provider Qualifications.  In order to qualify for the alternative method defined in 904.3.2.3.2, a Provider must implement and maintain all six of the following measures. Compliance will be demonstrated via QA review reporting to RESNET:

1. Provider uses an automated QA/data analytics system to evaluate rating files for accuracy and/or errors. Automated QA/data analytics system must be reviewed and approved by RESNET Staff. RESNET Staff review can be accomplished through Provider demonstration showing RESNET Staff the user (input) side of such systems through a video/screenshare platform (Zoom, GoToMeeting, MS Teams, etc.) which would not require RESNET Staff to be given full access to the application or any back-end programming.

2. For each QA Field review, the QAD verifies and documents all available minimum rated features via time-stamped, geo-tagged photos or screenshots. The QAD shall follow the same photo requirements as the Rater/RFI at that stage of construction, as specified by the current MINHERS Chapter 9 and/or the current ANSI/RESNET/ICC 301. QAD photos are archived and subject to RESNET review for three years.

3. Each Rater/RFI performing pre-drywall inspections must receive at minimum one (1) pre-drywall QA Field review annually. This pre-drywall QA Field review may be done remotely, following the RESNET Remote QA Protocol. Compliance will be demonstrated via field QA reporting.

4. Every Rater/RFI receives a full written report from the QAD for each QA Field review. The report goes well beyond pass/fail and includes detailed findings, positive reinforcement where appropriate, and corrective actions and mentoring when necessary.

5. Method for tracking and verifying frequency and types of errors or failures in QA reviews for Raters and RFIs. Items tracked will be submitted as part of the annual QA report submission and addressed in the ongoing training/mentoring of Raters, RFIs and HERS Modelers.

6. Provider requires ongoing Training/Mentoring of active Raters, active RFIs and active HERS Modelers for a minimum four (4) hours annually, in addition to Professional Development specified in the MINHERS for recertification. Training should be relevant to the job of the Rater, RFI and/or HERS Modeler. Training needs may vary by organization or by individual based on the results of QA Field reviews, QA File reviews, automated QA tool results and/or other means. Training of Raters, RFIs and HERS Modelers by a RESNET Instructor, QAD, or lead Rater shall be specifically aimed at increasing the consistency and accuracy of ratings and may take different forms, including:

a. Field Mentoring

b. Plan Review Mentoring (for HERS Modelers)

c. Team Meetings


Comment #15

Amendment: Proposed MINHERS Addendum 72, File and Field QA, Draft PDS-01
Page Number: 1-10
Comment Intent: Not an Objection
Comment Type: General

Comment:

Our company is a Provider that only works with our own certified HERS Raters (W-2 employees), and we have an external QAD.  For our arrangement, I think this draft standard could work reasonably well.  However, I have skimmed comments 1-14 and am reminded of the diverse business operations of others in our industry.  At a glance, I agree that this process probably won't work well for everyone.

Of note, compliance with 904.3.3.2.1 gets more expensive for a Rater/Rating Company the fewer annual ratings they have, especially if they have an external QAD and/or Provider. Smaller companies are more likely to have an external Provider and QAD.


Comment #16

Amendment: Proposed MINHERS Addendum 72, File and Field QA, Draft PDS-01
Page Number: 1-10
Comment Intent: Objection
Comment Type: Technical

Comment:

Taking the content of this addendum generally at face value, I'm submitting this as a group of technical objections because a number of questions came up that I think need to be addressed at a minimum. I urge the committee to clarify its intent on these items. I provided strikeout/underline formatting for the more straightforward ones. Thanks for all of your work on this.

904.3.2 Section heading does not mention HERS Modelers.  Can it be shortened to remove HERS Rater?

904.2.2.8.1 Is "each new sampled community" intended to also mean "each new multifamily building/project/development"?  If so, some multifamily buildings are small and some multi-phase developments go on for 10 years.

904.3.2.5 states: "The HERS Modeler QA File reviews may fulfill all of the HERS Rater's annual QA File review requirement."  Clarify that the Modeler QA and Rater QA counts are for a given Confirmed/Threshold/Sampled Rating, not throughout the company.

904.3.3 Section heading does not mention RFIs.  Can it be shortened to remove HERS Rater?

904.3.3.2.2 states: "The RFI QA Field reviews may fulfill all of the HERS Rater's annual QA Field review requirement, but only if the HERS Rater does not do field inspections on completed homes."  If "on completed homes" is removed, does it have the same intent for you?  It sounds to me like "on completed homes" is explicitly excluding pre-drywall inspections.  Clarify that the RFI QA and Rater QA counts are for a given Confirmed/Threshold/Sampled Rating, not throughout the company.

904.3.3.2.2.1.1.c This section on RFIs references "full file QA reviews" and therefore needs to be amended.

904.3.3.2.6 states: "If at least two (2) homes are required for QA Field review, a maximum of one (1) of the homes SHALL be a non-tested, sampled home."  Shall means must.  Is this sentence intended to mean MUST or MAY?  I think it should say MAY.

Proposed Change:

904.3.2 HERS Rater Quality Assurance File review (QA File review)

904.3.3 HERS Rater Quality Assurance Field review (QA Field review).

904.3.3.2.2 The RFI QA Field reviews may fulfill all of the HERS Rater's annual QA Field review requirement, but only if the HERS Rater does not do field inspections.

904.3.3.2.2.1.1.c During the year of the exemption, the Rater(s) of Record associated with that year’s total field work of the eligible RFI shall receive 100% or 15 full file QA reviews, whichever is less.

904.3.3.2.6 If at least two (2) homes are required for QA Field review, a maximum of one (1) of the homes may be a non-tested, sampled home.

