Tools, Technologies and Training for Healthcare Laboratories

Quality of Proficiency Testing

The 2008 report from the CDC dared to ask the big question: is Proficiency Testing effective? Not only that, it dared to recommend a significant change in the CLIA statutes. Can you guess which law should be changed for better PT?

The Quality of Proficiency Testing
(or, the Quality of Laboratory Testing, revisited)

July 2008

Sten Westgard, MS


Back in 2006, the Division of Laboratory Systems at the Centers for Disease Control and Prevention (CDC) created a thirteen-member working group “to assess the effectiveness of clinical laboratory proficiency testing for regulatory, educational, and quality improvement purposes.”[1]

This group was charged with the following tasks:

“1. Report on the status of current clinical PT programs in the US, assessing the success of PT in improving the quality of clinical testing and identifying areas in which improvements are needed.

2. Make recommendations for improving the effectiveness of PT in meeting regulatory, educational and quality improvement objectives.

3. Solicit and consider input from stakeholders including PT providers, PT users in several types of clinical laboratories, and accrediting organizations; assess the feasibility of recommendations.

4. Identify needed improvements to PT that could be addressed during the next five years.”[2]

Now that we’ve reached 2008, that CDC Report on Proficiency Testing is out (read the whole thing here). Clinical Lab News produced an excellent overview of the report in their June 2008 issue (read it here).

We’re not going to recap the entire report, although it’s worth a good look. Instead, we’re going to home in on the first task, quoting a number of key passages. Assessing the success of PT is obviously the top priority, but there is also a key assumption hidden in that task. Implicit in that objective is an assessment of the past and current quality of laboratory testing. If you don’t know the quality of laboratory testing, how can you know whether PT has improved it?

The existential question: does Proficiency Testing improve laboratories?

“Attitudes toward PT vary widely among laboratorians and responsible executives in hospitals and independent laboratories. Some regard PT as an integral component of total quality management, with substantial educational value, while others regard it as a requirement for certification with no substantial utility.”[3]

Today, we take proficiency testing for granted. It seems like it has always been with us, like the sun or the air. PT is a regulatory requirement for many tests in US laboratories, so its value isn’t often discussed. It’s just something that has to be done, so even if you don’t believe in it, it’s a cost of doing business.

But if you move beyond the compliance mentality, there isn’t a strong scientific literature establishing the value of PT:

“Evidence concerning the value of PT programs in improving laboratory performance comes from a broad base of experiential evidence and anecdotal reports that indicated participating in PT improves laboratory performance. There is, however, no single study, much less a body of published evidence, which unequivocally demonstrates that participating in PT reduces the rate of errors in routine testing of patient samples. Despite years of experience with PT in clinical laboratories, the substantial regulatory apparatus required to administer the system, and the costs of PT, evidence concerning its effectiveness has not been pursued in a systematic fashion. This lack of clear evidence, which is characteristic of many aspects of laboratory medicine, is especially problematic in assessing the benefit of PT to medical care quality or patient safety in relation to the cost of PT to the health care system.” [4, emphasis added]

So, does the emperor have no clothes? Have we been doing PT all these years just to keep busy, to throw numbers on a chart, to give regulators and legislators something for their files, while gaining only a false sense of security?

Well, it’s not really that bad.

How PT can improve laboratory performance

“Experience with PT unquestionably reduces PT failure rates. For example, data gathered by CDC after promulgation of the CLIA regulations showed broad improvements in PT performance by regulated laboratories.[5] Other studies also indicate that PT performance improves as laboratories gain experience with PT participation.[6-9]”[10]

The fact that the longer you do proficiency testing, the better you do at it, isn’t a great sign in itself. Laboratories might simply be “teaching to the test,” resulting in rising PT success rates, much as SAT scores gradually rose as students were increasingly better “prepped” for the test (and when the SAT changed its testing methodology, scores then fell).[11]

But the CDC PT report concludes there are more forces at work in PT performance improvement:

"These improvements are not merely evidence that practice makes perfect; several systemic effects also contributed to the reduced failure rates:[7] (1) elimination of chronic poor performers from the pool of laboratories participating in PT for a given analyte, or correction of chronic problems by laboratories that remained in the pool, (2) improved PT materials and report forms, (3) familiarity with the program by participants, (4) identification of problems with methods and their correction, (5) adoption of more accurate and reproducible methods, (6) generally improved technical education and technical performance. The relative contributions of these mechanisms are not completely clear."[12]

When you examine these improvement factors, you can see that one key effect is that PT "thins the herd." That is, laboratories that fail PT challenges are more likely to drop out of the PT program and stop offering those testing services. If that is what is happening, it's arguably a good thing: PT should be weeding out the bad labs and making them cease operation (if they are unwilling to make improvements).

The fourth through sixth factors of PT improvement are what we’ve been looking for: evidence that participating in PT makes laboratories perform better, not only in PT, but also in routine testing:

"In laboratories that fail PT challenges or experience ‘near misses,’ investigation of PT results that are chronically worse than those of peer laboratories using the same method may turn up a systemic problem that, once corrected, will also improve the accuracy of routine testing. In other cases, chronically poor PT performance may motivate a laboratory manager to adopt a new instrument or method that is intrinsically more accurate or reliable. Improved education and performance may be the result of hiring more qualified personnel but may also result from an effective use of PT results and performance to educate laboratory personnel....

"True quality improvements may take place in response to PT failures. One CAP study reported that, when laboratory directors reviewed the causes of repeated unacceptable PT results, 50% of the investigations isolated and corrected problems related to instrumentation, methods, or other technical aspects of testing. Only 4.4% of the failures were attributed to the survey format or materials.[13] A 1994 study by the Wisconsin State Laboratory of Hygiene (WSLH)[14] found that laboratories demonstrated a pattern of improved performance after unsatisfactory PT performance brought attention to correctable problems. Meeting the CLIA PT requirements has been a prime motivator in improving laboratory performance.[15].... Investigation of PT errors can uncover… inadequacies in the laboratory’s QA program and lead to improvements in laboratory PT performance.”[16]

While the report provides an impressive list of ways that PT can improve laboratory performance, we still need to see the evidence that it has improved performance.

What evidence is available that PT is effective?

"Historically, hospital laboratories and independent laboratories as a group have performed substantially better than POLs [physician office laboratories] in PT.[17] Using PT data reported to CMS by PT program providers, a CDC review of the PT performance for laboratories in CLIA mandated PT programs demonstrated improvement in performance for most of the analytes/tests evaluated over a 12-year period (1994 - 2005) (unpublished data). In 1998, Stull[18] found disparate PT performance between traditional laboratories (hospital and independent laboratories [HI]) and alternative testing sites (all other testing laboratories [AOT], including POLs and ancillary health care providers). The aggregate rate of satisfactory test event performance for all regulated analytes, tests, and specialties was 97% for the HI group and 91% for the AOT group. In the ensuing decade, a broad and general improvement in PT performance has occurred in all types of laboratories, including smaller hospital laboratories and POLs. A 2007 publication by the American Proficiency Institute describes a 10-year study of PT performance by physician’s offices, clinics, and small hospital laboratories.[19] Failure rates for chemistry and hematology analytes declined significantly during the 10-year period. Failure rates for microbiology also declined but remained above 5% in 2004 for certain tests." [20]

By way of contrast, here’s a key passage from the 2006 GAO report on Clinical Lab Quality:

"CMS also pointed to the steady increase in successful proficiency testing across all labs as an indication of improvements in lab quality. Our analysis of proficiency testing results suggested that lab quality had not improved at hospital labs in recent years. CMS correctly noted that the overall proportion of labs with no test failures increased from about 88 percent in 1998 to about 93 percent in 2003—that is, fewer labs failed proficiency testing. However, by focusing on overall proficiency testing results, CMS data mask trends in failure rates for subsets of labs such as hospital labs. For example, from 1999 through 2003, the percentage of CAP-surveyed labs with proficiency testing failures increased from 4.1 percent to 6.8 percent; CAP generally inspects hospital labs.”[21, emphasis added]

Overall, the rates of PT failure have undeniably declined. There has undoubtedly been a correlation between PT success and quality, but causation has not been definitively proven. We still do not know whether improved PT performance simply means labs have learned how to do the PT surveys better, or whether the labs have learned how to test better.

The Question Stands: Is Proficiency Testing Effective?

Ultimately, the CDC PT report concludes that another report needs to be done.

"[C]omprehensive and systematic studies are needed that assess directly the linkage between PT performance and performance in routine testing of patient samples. Such studies are not merely of academic interest. Planned, prospective studies of a broad range of laboratories of all types are needed. These studies could track performance of the same group of laboratories over time to identify predictors of PT performance and effective strategies and practices for integrating PT performance into a broad quality management system. Monitoring data for the same group of laboratories would also help distinguish true improvements in PT performance and error rates in routine testing from artificial improvements due to drop outs by poorly performing laboratories. These surveys could identify areas in most need of improvement or additional research and analysis."[22]

Certainly, there is value in proficiency testing. But the value needs better proof, and studies need to be conducted that demonstrate not only the extent of the improvements but how PT drives those improvements.

The Elephant in the room: what about PT and Waived Testing?

Ironically, the most important finding in the CDC report is about tests that aren’t, for the most part, covered by proficiency testing: waived tests.

To understand this next part, let’s go back to the 2006 GAO report on Clinical Lab Quality:

"CMS also commented that the overall [proficiency testing] improvement cannot be dismissed as a result of some labs being granted waived status because the more dramatic improvements predated the recent increase in the number of waived labs. It further commented that removing waived labs from the data would not result in improved performance rates. First, the number of waived labs - those performing waived tests or provider-performed microscopy - increased by about 26,600 from 1993 through 1998 and then increased by another approximately 33,700 labs from 1998 through 2004. Second, CMS’s comment suggested that it had conducted an analysis of the impact of removing waived labs from the proficiency testing data. However, it did not provide any data analysis when we subsequently asked to see the evidence behind its assertion. COLA also addressed this issue, and did not challenge our conclusion that the decrease in proficiency testing failures for physician office labs might not represent an actual improvement in lab quality, but instead could reflect the fact that some problematic labs are no longer surveyed.”[23, emphasis added]

Keep in mind that of the approximately 193,000 registered US laboratories, 157,000 (81%) perform only waived testing and/or Provider-Performed Microscopy. Only 36,000 US labs perform non-waived testing and are required to participate in PT and submit to inspections. Of these non-waived labs, 19,700 (55%) are inspected by State Agencies under CMS guidance, 15,200 (42%) are inspected by professional organizations, and 1,100 (3%) are CLIA-exempt and subject to more rigorous state inspection programs. Also keep in mind that those 19,700 labs inspected under CMS guidance have been allowed to implement CMS’s “Equivalent QC” procedures, which reduce the minimum QC from 2 levels per day to 2 levels per week or even 2 levels per month. Thus CLIA regulations have effectively minimized the quality standards in 176,700 (91%) of the US laboratories that perform testing today.
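The lab-count arithmetic above can be sanity-checked with a short script (all figures come from the text; only the percentages are recomputed here):

```python
# Sanity check of the CLIA laboratory counts quoted above.
total_labs = 193_000    # registered US laboratories (approx.)
waived_ppm = 157_000    # waived testing and/or provider-performed microscopy only
non_waived = 36_000     # required to participate in PT, subject to inspection

cms_surveyed = 19_700   # inspected by State Agencies under CMS guidance
accredited = 15_200     # inspected by professional organizations
clia_exempt = 1_100     # CLIA-exempt, under state inspection programs

shares = [
    ("waived/PPM share of all labs", waived_ppm, total_labs),
    ("CMS-surveyed share of non-waived", cms_surveyed, non_waived),
    ("accredited share of non-waived", accredited, non_waived),
    ("CLIA-exempt share of non-waived", clia_exempt, non_waived),
    ("waived/PPM plus reduced-QC share", waived_ppm + cms_surveyed, total_labs),
]
for label, part, whole in shares:
    print(f"{label}: {100 * part / whole:.1f}%")
```

The last line recovers the article's 176,700 figure: the waived/PPM labs plus the CMS-surveyed labs permitted "Equivalent QC."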

The GAO report rightly raises the concern that improvement in PT performance, particularly in small labs, may be driven more by the “thinning” effect, which we noted earlier. But their conclusion adds an ominous note: it’s not that the poor labs are getting out of the testing business altogether - they're just shifting over to waived testing instead.
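This "thinning" effect can be illustrated with a toy simulation (the numbers here are hypothetical, chosen only to show the mechanism): the aggregate pass rate of the PT pool rises even though no individual lab performs any better, simply because the chronic failers leave the pool.

```python
# Hypothetical illustration of the survivorship ("thinning the herd") effect.
# Each lab has a fixed per-event pass probability that never changes.
good_labs = [0.98] * 900
poor_labs = [0.70] * 100

def pool_pass_rate(pool):
    """Average per-event pass probability of the labs still in the PT pool."""
    return sum(pool) / len(pool)

year1 = good_labs + poor_labs
print(f"Year 1 aggregate pass rate: {pool_pass_rate(year1):.3f}")

# Chronic failers exit the pool (e.g., shift to waived testing);
# every surviving lab is exactly as good or bad as before.
year2 = [p for p in year1 if p > 0.90]
print(f"Year 2 aggregate pass rate: {pool_pass_rate(year2):.3f}")
```

The pool's pass rate climbs from 95.2% to 98.0% with zero actual improvement in any laboratory, which is precisely why aggregate PT statistics alone cannot prove that testing quality has improved.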

There has been an explosion of waived testing labs and an even larger rise in new waived test methods. Since waived testing has relatively few requirements (essentially, just following the manufacturer’s instructions), there is far less regulatory oversight of those labs and those testing processes. It's troubling to think that the worst labs are getting out of regulated testing and into waived testing. This migration concentrates the risk in an area where we can least monitor or mitigate it.

The CDC expert group seems to acknowledge this danger in what we consider its most ground-breaking recommendations:

"27. Develop a process to assure that all clinical laboratories, including those that perform waived tests, participate in PT. This recommendation requires a change in the CLIA statute (law) (Public Health Service Act: Section 353 [263a][d][2][C]) that specifically exempts waived laboratories from standards (i.e. QC programs, PT, and inspections)" [24]

This is the only statute change called for by the working group. It’s a strong statement that waived testing needs to be brought back into the quality fold.

Coda: Some Ancient History which seems to be repeating

In 2004, the Maryland General laboratory scandal raised more questions about laboratory quality and the effectiveness of regulators and accreditors of laboratories. The much-publicized Maryland General story jolted Congress into action, or at least the semblance of action. One of the outcomes of Maryland General was that Congress asked the Government Accountability Office to investigate CMS, CAP, JCAHO, and COLA. Congress specifically charged the GAO with determining:

"(1) the quality of lab testing; (2) the effectiveness of surveys, complaint investigations, and enforcement actions in detecting problems and ensuring compliance; and (3) the adequacy of CMS oversight of the CLIA program."[25]

The GAO report came out in 2006, with stern conclusions about points 2 (“Oversight weaknesses mask quality problems”) and 3 (“CMS Oversight of CLIA is inadequate”) and a few practical recommendations.

But notice what was missing: any assessment of the actual quality of laboratory testing. The GAO concluded that “Insufficient Data Exist to Identify Extent of Serious Lab Quality Problems” because

“Determining the quality of lab testing is difficult because it is virtually impossible to crosswalk inspection requirements across survey organizations. Without standardized survey findings across all survey organizations, CMS cannot tell whether the quality of lab testing has improved or worsened over time or whether deficiencies are being appropriately identified.” [26]

Unable to provide an actual assessment of the quality of laboratory testing, GAO could only note that “Comprehensive analysis of the proficiency testing database is particularly valuable because it provides a uniform way to assess the quality of lab testing across survey organizations, which is not currently available for survey results.”

The GAO report didn’t do what it was supposed to do: assess laboratory quality. So it recommended (in fact, it was the number one recommendation in the report) that another report or study tackle this very important task.

Now, here we are with another report, produced under the guidance of CDC, our guardians of quality in laboratory testing. But the contract group and the expert advisors that produced the report didn’t have the tools to complete the job. They were empowered only to review current literature, not conduct actual research. Thus the question of the quality of laboratory testing in the US continues to go unanswered.


  1. James C Peterson, Robert H Hill, Robert S Black, James Winkelman, Daniel Tholen, Review of Proficiency Testing Services for Clinical Laboratories in the United States - Final Report of a Technical Working Group, Division of Laboratory Systems, CDC, April 2008, p.v. Accessed July 7, 2008.
  2. Ibid, p.v.
  3. Ibid, p.5
  4. Ibid, pp.13-14.
  5. Centers for Disease Control and Prevention. Clinical laboratory performance on proficiency testing samples – United States, 1994. MMWR 1996;45(9):193-196. Referenced in the PT Report.
  6. Leeber JC. Role of external quality assurance schemes in assessing and improving quality in medical laboratories. Clin Chim Acta. 2001;309(2):173-177. Referenced in the PT Report.
  7. Reilly AA, Salkin IF, McGinnis MR, Gromadzki S, Pasarell L, Kemna M, Higgins N, Salfinger M. Evaluation of mycology laboratory proficiency testing. J Clin Microbiol. 1999;37(7): 2297-2305. Referenced in the PT Report.
  8. Tholen DW. Improvements in performance in medical diagnostics tests documented by interlaboratory comparison programs. Accred Qual Assur. 2002;7:146-152. Referenced in the PT Report.
  9. Ehrmeyer SS, Laessig RH. Has compliance with CLIA requirements really improved quality in US clinical laboratories? Clin Chim Acta. 2004;346(1):37-43. Referenced in the PT Report.
  10. Op cit PT Report, p.14
  11. Karen W. Arenson, SAT Reading and Math Scores Show Decline, New York Times, August 30, 2006. Accessed July 8, 2008 (sub required)
  12. Op cit PT Report, p.14
  13. Tholen DW, Lawson NS, Cohen T, Gilmore B. Proficiency test performance and experience with College of American Pathologists' programs. Arch Pathol Lab Med. 1995;119(4):307-311. Referenced in the PT Report.
  14. Ehrmeyer SS, Burmeister BJ, Laessig RH, Hassemer DJ. Laboratory performance in a state proficiency testing program: what can a laboratorian take home? J Clin Immunoassay. 1994;17:223-230. Referenced in the PT Report.
  15. Ehrmeyer SS, Laessig RH. Effect of legislation (CLIA'88) on setting quality specifications for US laboratories. Scand J Clin Lab Invest. 1999;59(7):563-567. Referenced in the PT Report.
  16. Op cit PT Report, p.15
  17. Centers for Disease Control and Prevention. Clinical laboratory performance on proficiency testing samples - United States, 1994. MMWR. 1996;45(9):193-196. Referenced in the PT Report.
  18. Stull TM, Hearn TL, Hancock JS, Handsfield JH, Collins CL. Variation in proficiency testing performance by testing site. JAMA. 1998;279(6):463-467. Referenced in the PT Report.
  19. Edson DC, Massey LD. Proficiency Testing Performance in Physician's Office, and Small Hospital Laboratories, 1994-2004. Labmedicine. 2007;38(4):237-239. Referenced in the PT Report.
  20. Op cit PT Report, p.15
  21. GAO-06-416 Report, Clinical Lab Quality: CMS and Survey Organization Oversight Should Be Strengthened , June 2006, p.50
  22. Op cit PT Report, p.16
  23. Op cit GAO Report, p.50
  24. Op cit PT Report, p.46
  25. Op cit GAO Report, p.2
  26. Ibid, p.45