Tools, Technologies and Training for Healthcare Laboratories

Part V: A Few Bad Apples or the Tip of the Iceberg?

September 2004

After a second Congressional hearing revealed further problems with the inspection process, the "deemed providers" of accreditation are recalibrating their inspections. Still, the question remains: Was the scandal at Maryland General Hospital a case of a "few bad apples" or just the tip of the iceberg?

Hear, Hear, Hear! Hearings on untruth and unquality!
Part V: A Few Bad Apples or the Tip of the Iceberg?

with Sten Westgard, MS

After the second hearing on “Ensuring accuracy and accountability in lab testing”, the status of the investigation was summarized by the AACC Government Affairs Update (August 12, 2004), as follows [1]:

“The chair, Rep. Mark Souder (R-IN), agreed some changes were necessary, but doubted that many of the problems at MGH were countrywide. CMS had expressed similar views at the initial hearing. In an election year, with neither CMS nor FDA (the agencies with direct jurisdiction over CLIA) indicating interest in the issue, and with no advocacy groups proposing changes, it would appear that the issue is unlikely to move forward this legislative session. One possibility, however, is that the committee could urge CMS and FDA to make some limited regulatory changes.”

Should we heave a sigh of relief that this issue appears to be behind us, or should we be concerned about the conclusion of the investigation? Unfortunately, no data have been provided to support any objective decision about the adequacy of laboratory performance!

Will a CRS study address this issue?

At the hearing, Congressman Dutch Ruppersberger (D-MD) did raise the question of whether the Maryland General incident might be indicative of a national problem [1]. With that intent, he requested an analysis from the Congressional Research Service (CRS) concerning “the questions Congress should be considering to assure quality in clinical labs” and included a CRS memo as part of the record of his statement. He urged that this memo be considered as guidance to “explore legislative options to address this important issue.” So, in the absence of any interest from CMS, FDA, or any of our professional organizations, CRS defines the considerations that might lead to limited regulatory changes.

Here’s the outline of the CRS questions, as provided in the August 2004 issue of Clinical Laboratory News [2]:

A. Defining the Scope of the Problem

  1. Is the current level of oversight for clinical laboratories appropriate?
  2. What is the best mechanism to facilitate further exploration of the sufficiency of the checks and balances on laboratory quality at the federal and state level?
  3. Should Congress establish a federal task force or an advisory committee comprised of stakeholders to report back to Congress?
  4. Should Congress commission an independent study?

B. Oversight and Coordination

  1. Should the coordination of lab inspections and accreditation be centralized at the federal level or decentralized to the state level?
  2. Should CMS be granted additional resources to deem more third-party organizations to perform inspections?
  3. Should inspections be announced or unannounced?
  4. Are inspectors adequately trained?
  5. Are employees adequately trained with respect to proper use of testing procedures and laboratory equipment?
  6. Should there be a mandatory reporting system for adverse events and/or deficiencies that exist outside of the inspection process?
  7. Should reports from lab employees that are sent to lab management be copied to regulatory agencies?

C. Compliance and Enforcement

  1. Should laboratory deficiency letters be made public?
  2. Does the federal government need to provide additional protections for laboratory employees who come forward with information to inspectors, regardless of whether they are state inspectors, CMS inspectors, or inspectors from a professional organization?

As we discussed in part IV in this series, the regulatory process is primarily dependent on laboratory inspections, which are known to be flawed and unreliable for identifying cases of poor performance.

  • Reorganizing the inspection process (B1) and adding more deemed organizations (B2) will do little to improve a basically flawed approach.
  • Inspector training is, of course, needed if inspection is to be the regulatory mechanism (B4) – one would expect that this should already have taken place, given that CLIA inspections have been conducted for over ten years now.
  • Personnel standards already exist under CLIA and laboratories are already required to have ongoing competency assessment (B5).
  • Supposed improvement of the inspection process through unannounced inspections (B3) will likely create a more adversarial laboratory/inspector relationship and an even more fearful regulatory environment. Under these conditions, there’s little chance that employees will step forward to identify deficiencies, even with whistleblower protection (C2).
  • Improved reporting and improved communication of inspection results (B6-7, C1) may be helpful, but they do nothing to help uncover deficiencies, which is still the weakest step of the inspection process.

Therefore, further investigation must focus on the questions in part A above – the big questions. And the fundamental question is whether this is a case of a few bad apples (indicating random problems) or the tip of the iceberg (indicating systemic problems). If a few random problems, then the current level of oversight might be considered adequate (A1). If the problems are systemic, then a new and better regulatory approach must be developed (A2-4).

Random vs Systemic?

The CMS website [] provides a graphical summary of the “top four deficiencies cited” [], which are as follows:

  1. Quality Control - 2 Levels, i.e., not performing and documenting at least 2 levels of controls each day of testing.
  2. Quality Control – Follow Instructions; i.e., not following the manufacturer’s instructions for instrument operation and test performance.
  3. Quality Assurance – No Program, i.e., not establishing and following a comprehensive written quality assurance program to monitor overall laboratory operation.

Yes, those three are the top four deficiencies! We’re not making that up. Look at the graphic. Makes one wonder about the reliability of the information, doesn’t it?

An accompanying table expands the list and also documents the percentages of labs having different deficiencies.

  1. Not performing and documenting at least 2 levels of controls each day of testing, given as 5.1% of all labs and 4.9% of POLs in the 4th quarter figures in the CLIA UPDATE – December 2003.
  2. Not following the manufacturer’s instructions for instrument operation and test performance, given as 8.9% of all labs and 9.6% of POLs.
  3. Not establishing and following a comprehensive written quality assurance program to monitor overall laboratory operations, given as 7.8% of all labs and 7.4% of POLs.

NOTE: These numbers do not match the numbers on the graph. There is a mix-up between the deficiencies for "Quality Control - 2 Levels" and "Quality Control - Follow Instructions." One might suppose this is just a mislabeling on the graph, but the summary table "CLIA Update Statistics" also has these numbers mixed up. We assume the more detailed table "CLIA Surveys: Top Deficiencies First, Second, Third, and Fourth Cycles" is correct, but who knows! In any case, the numbers are not good!

These figures suggest that from 5-10% of laboratories have deficiencies with the basic quality control/assurance regulations. Is that a few bad apples?

  • For the 21,224 laboratories included in these statistics, that means that 1000-2000 had deficiencies.
  • That would translate to 50,000 to 100,000 Defects Per Million (DPM).
  • That would translate to Sigma-metrics of 3.14 to 2.78 “short term” or 1.64 to 1.28 “long term.”
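The DPM-to-sigma conversion used above can be sketched in a few lines of Python; the 1.5-sigma offset between short- and long-term sigma is the standard Six Sigma convention, and any small differences from the quoted figures are rounding:

```python
from statistics import NormalDist  # stdlib normal distribution (Python 3.8+)

def sigma_from_dpm(dpm: float) -> float:
    """Short-term sigma for a given defect rate: the z-score that leaves
    the defect fraction in the upper tail, plus the conventional 1.5 shift."""
    long_term_sigma = NormalDist().inv_cdf(1 - dpm / 1_000_000)
    return long_term_sigma + 1.5

for pct_defective in (5, 10):
    dpm = pct_defective * 10_000  # percent of a million opportunities
    st = sigma_from_dpm(dpm)
    print(f"{pct_defective}% defective = {dpm:,} DPM -> "
          f"{st:.2f} sigma short term, {st - 1.5:.2f} long term")
# 5% defective = 50,000 DPM -> 3.14 sigma short term, 1.64 long term
# 10% defective = 100,000 DPM -> 2.78 sigma short term, 1.28 long term
```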

That’s not good! The number of defective laboratories, according to these inspection statistics, is unacceptable for routine operation and production. The absolute lowest sigma acceptable for production operations is 3.0 sigma, with the goal for “World Class Quality” being 6.0 sigma. If 5-6 Sigma were observed, then we might conclude there are a “few bad apples.” When only 3 Sigma is being observed, that suggests a much larger problem that is more likely systemic.

Do deficiencies equal bad results?

People will argue that a few deficiencies don’t necessarily mean that laboratories produce bad results - every laboratory will have some deficiencies when inspected. While that may be true, our concern is that these are fairly serious deficiencies, i.e., not running QC, not following the manufacturer’s directions on how to perform the test or operate the test equipment, and not having any quality system to monitor if the testing process and results are good or bad.

CMS will argue that these figures have been improving. True, but the numbers quoted here are the ones that represent the improvement. They may have been worse before, but they’re still not very good.

And some will say that the inspection process is flawed and doesn’t provide a reliable source of information on which to judge the quality of laboratory testing. We agree! The current evidence, as discussed above and previously in Part IV, may not be reliable. However, we suspect that the inspections underestimate the deficiencies, so things may actually be worse rather than better.

What other data is available for assessing test quality?

The CLIA regulations require that laboratories participate in proficiency testing. The correctness of test results is evaluated against “criteria for acceptable performance” defined in the regulations. Thus, there exist several databanks that might be used to determine the number of defective test results, which in turn would allow estimates of the sigma capability of laboratory testing processes.
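As an illustration of how such an evaluation might work, here is a minimal sketch of scoring a single PT result against a CLIA-style acceptance criterion. The glucose limit (target ± 6 mg/dL or ± 10%, whichever is greater) follows the published CLIA criteria; the result and target values in the usage example are hypothetical:

```python
def pt_result_acceptable(result: float, target: float,
                         abs_limit: float, pct_limit: float) -> bool:
    """A PT result passes if it falls within target +/- the larger of an
    absolute limit and a percentage-of-target limit (CLIA-style criterion)."""
    allowed = max(abs_limit, pct_limit / 100 * target)
    return abs(result - target) <= allowed

# Hypothetical glucose PT sample with a target of 100 mg/dL:
print(pt_result_acceptable(108.0, 100.0, abs_limit=6.0, pct_limit=10.0))  # True
print(pt_result_acceptable(112.0, 100.0, abs_limit=6.0, pct_limit=10.0))  # False
```

Counting the fraction of such failures across a PT databank would give the defect rate needed for a sigma estimate.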

There also exist databanks for “peer comparison” programs that can provide estimates of both the imprecision and bias of different test methods, as well as the sigma-quality of individual laboratory performance. Some of these databanks reside in professional organizations, such as CAP, and others reside with industrial suppliers of quality control materials and services. These databanks would also facilitate the estimation of the sigma capability of current testing processes.

Why Sigma?

Six Sigma quality management provides the framework and tools for truly evaluating the quality of processes against requirements for quality (tolerance limits such as the CLIA PT criteria for acceptability) [3]. Six Sigma also provides widely accepted goals for process performance (goal is 6-sigma capability for world class quality, minimum acceptable performance is 3-sigma). Process performance can be determined for pre-analytic and post-analytic processes using the DPM methodology [4]. Process performance can be determined for analytic processes on the basis of precision and bias that are observed. Performance can be benchmarked across processes and industries, allowing quality in healthcare to be compared with quality in any other industry.
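For analytic processes, the sigma-metric follows directly from the quality requirement (allowable total error, TEa), the observed bias, and the observed imprecision (CV), all expressed in percent. A minimal sketch, using hypothetical method figures:

```python
def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """Sigma = (TEa - |bias|) / CV, with all terms in percent of the
    target concentration."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical method: 10% allowable total error, 1% bias, 2% CV:
print(sigma_metric(10.0, 1.0, 2.0))  # 4.5
```

A method at 4.5 sigma would fall between the 3-sigma minimum and the 6-sigma world-class goal.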

We recommend that any new studies of laboratory performance make use of sigma-metrics to determine whether current problems represent only a few bad apples or the tip of the iceberg!


  1. AACC Government Affairs Update, August 12, 2004.
  2. Parham S. Hearings investigate cause of invalid HIV, HCV results: Are changes needed in lab inspection, accreditation processes? Clin Lab News August 2004.
  3. Westgard JO. Six Sigma Quality Design and Control: Desirable precision and requisite QC for laboratory measurement processes. Madison, WI: Westgard QC, Inc., 2001.
  4. Westgard JO, Ehrmeyer SS, Darcy TP. CLIA Final Rules for Quality Systems: Quality assessment issues and answers. Madison, WI: Westgard QC, Inc., 2004.

James O. Westgard, PhD, is a professor of pathology and laboratory medicine at the University of Wisconsin Medical School, Madison. He also is president of Westgard QC, Inc., (Madison, Wis.) which provides tools, technology, and training for laboratory quality management.