Tools, Technologies and Training for Healthcare Laboratories

Quality of HbA1c in 2014

Lenters-Westra and Slingerland have just published a new review of HbA1c POC devices. The good news: 4 out of 7 methods now meet the minimum analytical performance criteria. The bad news: that still leaves 3 of 7 that don't. There's still a lot of room for improvement.

Another Teaching Moment: POC HbA1c Performance

James O. Westgard, PhD
June 2014

A new report on the performance of POC HbA1c instruments has been published online in Clinical Chemistry [1]. Authored by Lenters-Westra and Slingerland, this report extends their earlier work [2] and again points out that some POC instruments do not satisfy current performance criteria. Keep in mind that those performance criteria have become considerably more demanding from 2010 to 2014. Both the NGSP certification and CAP proficiency testing criteria are now 6%, compared to the 2010 NGSP criterion of 0.75 %Hb (or 11.5% at 6.5 %Hb) and the 2010 CAP criterion of 8.0%. As discussed earlier on this website [3], the quality goals and performance requirements were not properly aligned, with the NGSP criterion being less stringent than the CAP criterion. Finally, in 2014, the NGSP and CAP criteria are consistent, both specifying an allowable Total Error (TEa) of 6.0%.
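
To put that criterion in concrete terms, at the critical decision level of 6.5 %Hb a 6.0% TEa corresponds to an allowable error of about ±0.39 %Hb (0.06 × 6.5 %Hb).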

A Teaching Moment - 2010

Readers of this website may recall that the earlier Lenters-Westra and Slingerland paper was the basis for an extended “real-world” case study on method validation that involved a dozen lessons that progressed through the process of reading, reviewing, and evaluating published data on HbA1c performance [4].

  • Episode 1 identified an abstract from the 2009 AACC national meeting, a validation study published in the January 2010 issue of Clinical Chemistry, and an accompanying editorial in that same issue.
  • Episode 2 discussed the information available in the 2009 abstract and concluded that any decision on acceptable performance depended on knowing the quality needed for the clinical use of a HbA1c test.
  • Episode 3 reviewed the recommendations for quality requirements, as discussed by Bruns and Boyd in their editorial that accompanied the publication of the evaluation study by Lenters-Westra and Slingerland.
  • Episode 4 discussed the study by Lenters-Westra and Slingerland.
  • Episode 5 featured a discussion by Dr. Craig Foreback of measurement principles to help us understand both the NGSP certification protocol and the use of 3 different secondary reference methods in the Lenters-Westra and Slingerland study.
  • Episode 6 provided a more detailed discussion of the statistical analysis of the replication and comparison of methods experimental results.
  • Episode 7 made use of error grids to compare different quality goals and requirements that might be applied to determine the acceptability of a HbA1c method.
  • Episode 8 discussed how to prepare a Method Decision Chart to help judge the acceptability of performance.
  • Episode 9 provided a method decision chart to summarize the acceptability of performance for all these methods.
  • Episode 10 reviewed the performance of other methods on the basis of CAP proficiency testing survey results.
  • Episode 11 summarized comments from Dr. Erna Lenters-Westra in response to these lessons and provided the investigators’ perspective from an IFCC HbA1c reference laboratory.
  • A later study from 2011 was reviewed to illustrate application of “Reference Change Values” to assess the capability for determining significant differences in patient monitoring [5].
  • A discussion of the “Top of Diabetes Diagnostics” symposium, held in conjunction with Dr. Erna Lenters-Westra’s doctoral defense, was also provided [6].

A Teaching Moment – 2014

If you want to review and refresh your method validation skills, we’ll work through the validation data with you and demonstrate how to use the Method Decision Chart to judge the acceptability of performance of these POC instruments. Here’s the assignment.

  • Read the latest paper [1] from Lenters-Westra and Slingerland.
  • Review the important data for estimating precision (Table 1) and bias (Table 2).
  • Use the regression statistics for “Lot number 1” in Table 2 to calculate the expected Systematic Error, or bias, at the critical medical decision concentration of 6.5 %Hb (a calculation sketch follows this list).
  • Prepare a Method Decision Chart for the NGSP and CAP allowable Total Error of 6.0%.
  • Assess the sigma quality of the 7 POC instruments.
  • Identify those methods that could be adequately controlled to verify the attainment of the intended quality of test results.
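
For those who want to check their arithmetic, here is a minimal sketch, in Python, of the bias and Sigma-metric calculations, assuming the usual formulas (bias estimated from the regression line at 6.5 %Hb, and Sigma = (TEa - |bias|)/CV with all terms in percent). The slope, intercept, and CV in the example are placeholders, not values from the paper, and the helper names are just for illustration; substitute each instrument’s Table 1 CV and Table 2 “Lot number 1” regression statistics.

    # Sketch of the bias and Sigma-metric calculations for one POC HbA1c method.
    # Placeholder values only; substitute each instrument's Table 1 CV and the
    # Table 2 "Lot number 1" regression statistics from the paper.

    TEA = 6.0   # allowable Total Error (%), the 2014 NGSP/CAP criterion
    XC = 6.5    # critical medical decision concentration (%Hb)

    def bias_at_decision_level(slope, intercept, xc=XC):
        """Systematic error at xc from the comparison-of-methods regression,
        expressed as a percentage of xc."""
        yc = slope * xc + intercept          # value predicted for the POC method
        return abs(yc - xc) / xc * 100.0     # bias as % of the decision level

    def sigma_metric(cv_pct, bias_pct, tea=TEA):
        """Sigma-metric = (TEa - |bias|) / CV, with all terms in percent."""
        return (tea - bias_pct) / cv_pct

    # Example with made-up numbers: slope 0.98, intercept 0.20 %Hb, CV 2.0%
    bias = bias_at_decision_level(slope=0.98, intercept=0.20)
    print(f"Bias at {XC} %Hb: {bias:.2f}%")          # about 1.1%
    print(f"Sigma: {sigma_metric(2.0, bias):.2f}")   # about 2.5

Plotting each instrument’s operating point (observed CV on the x-axis, |bias| on the y-axis) on a chart whose lines are given by bias = TEa - k*CV for k = 2 through 6 produces the Method Decision Chart; methods whose points fall in the higher-Sigma regions are the ones that can be adequately controlled.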

You may want to review the previous series of lessons [4] to guide you through the method evaluation process. It is a process, which means there is a standard series of steps that, when followed, should produce a standard outcome, i.e., you should all come up with the same assessment of sigma quality if you use the same data (Tables 1 and 2) and the same quality requirement (6% TEa). Note that you want to use the DCCT (Diabetes Control and Complications Trial) units of %Hb for the US marketplace. Data in SI units are also provided in Table 1 and Figure 1 (A, B, C, D) and Figure 2 (A, B, C), which will be of more interest in other parts of the world.
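
If you need to move between the DCCT/NGSP units (%Hb) and the IFCC SI units (mmol/mol), the NGSP-IFCC master equation can be applied; here is a small sketch using the commonly quoted, rounded coefficients (the function names are just for illustration), so treat the results as approximate.

    # Approximate conversion between DCCT/NGSP units (%Hb) and IFCC SI units
    # (mmol/mol), using commonly quoted rounded master-equation coefficients.

    def ngsp_to_ifcc(ngsp_pct):
        """DCCT/NGSP %Hb -> IFCC mmol/mol."""
        return (ngsp_pct - 2.15) * 10.929

    def ifcc_to_ngsp(ifcc_mmol_mol):
        """IFCC mmol/mol -> DCCT/NGSP %Hb."""
        return ifcc_mmol_mol * 0.09148 + 2.152

    print(round(ngsp_to_ifcc(6.5)))      # ~48 mmol/mol at the 6.5 %Hb cut-off
    print(round(ifcc_to_ngsp(48), 1))    # ~6.5 %Hb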

References

  1. Lenters-Westra E, Slingerland RJ. Three of 7 hemoglobin A1c point-of-care instruments do not meet generally accepted analytical performance criteria. http://hwaint.clinchem.org/cgi/doi/10.1373/clinchem.2014.224311
  2. Lenters-Westra E, Slingerland RJ. Six of eight hemoglobin A1c point-of-care instruments do not meet the general accepted analytical performance criteria. Clin Chem 2010;56:44-52.
  3. Westgard JO. Update on HbA1c quality goals and performance requirements. February 2011. www.westgard.com/quality-hba1c-2011.htm.
  4. See Part XI for links to these lessons. www.westgard.com/hba1c-methods-part11.htm
  5. Westgard S. www.westgard.com/quality-hba1c-2011-part2.htm
  6. Westgard JO. Top of Diabetes Diagnostics Symposium. www.westgard.com/the-quality-of-diabetes-testing-2011.htm