Tools, Technologies and Training for Healthcare Laboratories

Tools and Technology for QC Planning

Just as instruments have evolved from manual pipettes to vast automated systems, so too has the technology for quality control. When we're using a fourth-generation instrument, should we be doing first-generation quality control (like the old standby, 1:2s), a practice that was introduced in the 1950s? Dr. Westgard charts the history and progress of QC technology and introduces new tools (QC Selection Grids, OPSpecs charts, automated QC selection) that we can use for the 21st century.


QC planning is the solution to a problem that some people don't even know they have! That's one of the difficulties in getting started with QC planning. Another is the time it takes to do QC planning. The time difficulty can be solved by applying appropriate tools and technology. However, you first need to recognize the potential applications to become convinced that it's important to learn how to do QC planning.

Quality planning

Although I generally use the term "QC planning" (to emphasize the selection of QC procedures to assure the quality required for the test), this is actually a more general "quality planning" process that has a wide variety of applications.

Laboratory applications: Selecting control rules and numbers of control measurements is, of course, an important application in a service laboratory. In addition, the performance needed by the method can also be determined if the QC procedures are given, which should be useful in establishing purchase specifications for methods, instruments, and systems. QC recommendations from manufacturers and QC guidelines given in the literature can be evaluated to be sure they are adequate for the quality required for the test and the analytical performance claimed for the method. It is also possible to compare allowable total errors, clinical decision intervals, and biologic goals to determine which are most demanding and should take priority in managing a testing process.

Manufacturers' applications: The design of new methods and systems should be greatly aided by a quantitative approach for setting performance specifications for imprecision and inaccuracy. QC recommendations can be objectively developed and validated based on regulatory quality requirements, the design specifications for imprecision and inaccuracy, and the common QC practices in the marketplace. Customers can be supported and assisted in the proper management of analytical systems by having a better understanding of the relationships between the quality required for a test, the imprecision and inaccuracy expected from a method, and the QC procedures to be implemented.

Regulatory and accreditation applications: It is interesting that there is no documentation of the source and origin of proficiency testing (PT) criteria, such as those allowable total errors specified by CLIA. Regulatory agencies and PT providers should evaluate the practicality of proposed PT criteria by comparison with clinical decision intervals, taking into account the common QC practices and expected method performance. Manufacturers' QC product labeling should be reviewed to assess whether those QC instructions are valid for the intended users. Laboratory QC practices should be reviewed to assess whether they are valid to assure the quality needed for the patient populations and clinical applications of a healthcare organization. Laboratorians and manufacturers should be educated and supported to provide more optimal management of the analytical quality of their tests and systems.

Need for tools and technology

Another difficulty is that QC planning takes extra time; it doesn't replace an existing task because we've never done it before. Yet, one of the benefits of QC planning is that it can save time (and money) by reducing false rejections and minimizing repeat analyses, as well as giving you the assurance that the test results are correct within your required quality. Most analysts, therefore, agree that QC planning should be part of good management practices in a laboratory, but it is still difficult to find the time to do anything new in today's busy laboratories. And, if something new is going to be added, it had better be quick and easy to do.

A laboratory's ability to do anything efficiently often depends on utilizing tools and technology to facilitate a process. Most laboratory procedures have evolved from an initial qualitative manual method that has then been systematized and made more quantitative with laboratory tools such as diluters and photometers, then automated through succeeding generations of technology until complete systems are available that are highly efficient and productive. QC planning, likewise, must evolve from a qualitative manual method to a systematic process that utilizes standard tools to a quantitative automated process that is quick and effective.

A little history of QC planning

QC simulation programs and power function graphs: Many years ago, when we first started studying the performance of QC procedures and assessing what rules and numbers of control measurements were appropriate for a test [1], it took a long time to do the necessary work because it usually involved performing computer simulations to determine power function graphs [2]. Working with Dr. Torgny Groth, Dr. Torsten Aronsson, and Professor Carl-Henric deVerdier at Uppsala University in Sweden, we developed an interactive QC simulation program [3,4] that could be easily used by a laboratory analyst to determine power functions for a wide variety of control rules. With the availability of this research tool, we established guidelines for improving laboratory quality control [5]. Working with Patricia Barry at the University of Wisconsin, we applied this tool in the laboratory and developed an approach for cost-effective management of laboratory testing processes [6]. During this time, a QC simulation program was the tool that was needed to do QC planning, but that tool was practical only in large laboratories that had resources for research and development.
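The simulation approach described above can be illustrated with a minimal sketch: a Monte Carlo estimate of one point on a power function for the simple 1:2s rule (reject the run if any control measurement falls outside +/- 2 SD). The function name and parameter choices here are illustrative, not taken from the QC Simulator program itself.

```python
import random

def power_12s(se_shift, n_per_run=2, n_runs=20000, seed=1):
    """Monte Carlo estimate of the probability of rejection for the 1:2s
    rule with n_per_run control measurements, given a systematic error
    of se_shift (in multiples of the SD). With se_shift = 0 this gives
    the false-rejection probability; larger shifts give error detection."""
    random.seed(seed)
    rejects = 0
    for _ in range(n_runs):
        # One run: n control measurements with the shift added
        if any(abs(random.gauss(se_shift, 1.0)) > 2.0 for _ in range(n_per_run)):
            rejects += 1
    return rejects / n_runs
```

Evaluating this over a range of shifts (0 to 4 SD, say) traces out a power curve; repeating for different rules and N values yields the families of curves shown on power function graphs.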

Critical-error graphs: In the mid to late 80s, when I was involved with implementing TQM in the laboratory, I gained a greater appreciation of the importance of graphical tools to facilitate problem solving. This led us to start calculating the critical-sized errors to be detected by QC and imposing those errors on power curves, thus transforming power function graphs into critical-error graphs that were more useful and informative. We used this new tool in a quality improvement project to improve the cost-effectiveness of a multitest chemistry analyzer through careful optimization of QC design [7] (which is discussed elsewhere on this website). We also started teaching this approach in workshops, but soon found that participants still didn't do QC planning when they got back in their laboratories. Universally, they said they just didn't have enough time to do this.

QC Selection Grids: Although it had taken us several months to select QC procedures for a multitest chemistry analyzer, Elsa Quam realized we could do QC planning more quickly for other analyzers by generalizing or summarizing our results. This insight came during a conversation on a bus trip from Madison to Milwaukee to attend a clinical chemistry meeting; we pretty much developed the concept of QC Selection Grids [8] during the return trip (see lesson on QC Selection Grids). These grids provided a "table look-up" tool for selecting a QC procedure on the basis of the calculated critical systematic error and the expected stability of the method, then relied on the analyst to use power function graphs to assess the probabilities for rejection. The difficulty, in practice, was that many analysts skipped the last step and therefore ran the risk of ending up with an inappropriate QC procedure. QCSGs, therefore, are quick and easy, but at best are a semi-quantitative tool. It would be better if there were a tool that built in the error detection capability up front, rather than depending on, or risking, an assessment at the end.
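For readers unfamiliar with the calculation that drives the grid look-up, a sketch of the standard critical systematic error formula may help (the 1.65 z-value corresponds to allowing a 5% defect rate; the example numbers are hypothetical, not from the grids themselves):

```python
def critical_systematic_error(tea, bias, cv):
    """Critical systematic error, in multiples of the SD, that QC must
    detect: the shift that would cause 5% of results to exceed the
    allowable total error TEa (hence the 1.65 z-value).
    tea, bias, and cv must all be in the same units (e.g. percent)."""
    return (tea - abs(bias)) / cv - 1.65

# Hypothetical example: TEa = 10%, bias = 2%, CV = 2%
# delta_SEcrit = (10 - 2)/2 - 1.65 = 2.35 SD
```

A large critical error (say 3 SD or more) is easy to detect and permits simple, low-N rules; a small one demands multirule procedures or more control measurements.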

Quality-planning models and OPSpecs charts: In the early 90s, we formulated quality-planning models [9,10] and started using an electronic spreadsheet to prepare charts of operating specifications that showed the relationship between the precision and accuracy that were allowable and the QC that was necessary to assure a defined quality requirement would be achieved in routine testing [11,12]. OPSpecs charts provided a faster and more quantitative QC planning tool, but a reviewer of one of the initial papers commented that even though this was nice in theory, it would be impractical to apply in laboratories unless a computer program could be made available to perform the necessary calculations and prepare the charts.
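The core relationship behind such a chart can be sketched in simplified form. This is only an illustration of the idea, not the published models [9-12], which include additional terms and chart conventions; the function and parameter names are my own.

```python
def opspecs_allowable_bias(tea, cv, delta_se_control):
    """Simplified sketch of an OPSpecs-style operating limit: the
    inaccuracy (bias) allowable for a method with imprecision cv,
    given a QC procedure able to detect a systematic shift of
    delta_se_control SDs. Rearranged from the planning model
        bias + (delta_se_control + 1.65) * cv <= TEa
    All quantities in the same units (e.g. percent)."""
    return tea - (delta_se_control + 1.65) * cv

# Hypothetical example: TEa = 10%, CV = 2%, QC detects a 2.0 SD shift
# allowable bias = 10 - (2.0 + 1.65) * 2 = 2.7%
```

Plotting allowable bias against CV for several QC procedures produces the family of operating lines that makes an OPSpecs chart readable at a glance: a method's observed (CV, bias) point must fall below a line for that QC procedure to assure the quality requirement.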

QC Design software: the Validator computer program: It soon became obvious that the reviewer was right and that a special computer program would be needed to make this tool available. This led to the development of the QC Validator program to support QC planning and the preparation of graphics tools, such as power function graphs, critical-error graphs, and OPSpecs charts. The first version of this program prepared the graphs and charts based on both analytical and clinical quality requirements, but required manual operation to select an appropriate QC procedure. The second version added an automatic QC selection function that could be initiated by indicating the number of control materials to be analyzed, after which the program makes the QC selection based on default settings for certain selection criteria and logic that reflect pretty much how I would do it myself. Note, however, that the selection criteria and logic can be edited by the user, so the program can be made to perform according to your own preferences, rather than mine.

Transformation of new knowledge to tools and technology

I will admit it's a bit scary to realize that a computer program can do something that it has taken me almost twenty years to learn and understand. However, this represents the desired outcome of new knowledge, its application through tools, and its implementation through technology.

The development and application of new QC technology is really no different than what has happened for test methods. We learn the chemical principles of a glucose test and the proper way to perform the test, then we evaluate the technology that is available in a kit or in an instrument system. Once that technology has been demonstrated to work properly, we accept it as valid and make widespread applications. Laboratories today mainly purchase technology in the form of test kits and analytical systems to get their work done efficiently.

The analogy with QC planning is that you must learn the principles and theory about power function graphs, critical-error graphs, OPSpecs charts, and quality-planning models, then practice using these tools to perform QC planning. You can prepare these tools from scratch or use available kits, such as the OPSpecs Manual or normalized OPSpecs charts, or available technology such as the QC Validator computer program. On the job, you will need to evaluate and adopt the most advanced tools and technology available to make the process as efficient as possible.

Getting started with the available tools and technology

Better tools and technology will be available in the future, but that won't help you design, operate, manage, inspect, or approve testing processes and systems today. Get started now and take advantage of new and better technology when it becomes available. Someday there will be automatic QC processes that reside in instruments, PC workstations, and information systems that will provide totally automated management of analytical quality. However, we need to learn to use the tools and technology that are available today to be prepared for tomorrow.

References

  1. Westgard JO, Groth T, Aronsson T, Falk H, deVerdier C-H. Performance characteristics of rules for internal quality control: Probabilities for false rejection and error detection. Clin Chem 1977;23:1857-67.
  2. Westgard JO, Groth T. Power functions for statistical control rules. Clin Chem 1979;25:863-69.
  3. Westgard JO, Groth T. Design and evaluation of statistical control procedures: Applications of a computer 'QC Simulator' program. Clin Chem 1981;27:1536-1545.
  4. Groth T, Falk H, Westgard JO. An interactive computer simulation program for the design of statistical control procedures in clinical chemistry. Computer Programs in Biomedicine 1981;13:73-86.
  5. Westgard JO, Groth T, deVerdier C-H. Principles for developing improved quality control procedures. Quality Control in Clinical Chemistry - Efforts to Find an Efficient Strategy, Scand J Clin Lab Invest 1984;44:Suppl 172:19-42.
  6. Westgard JO, Barry PL. Cost-Effective Quality Control: Managing the quality and productivity of analytical processes. AACC Press, Washington, DC, 240 p, 1986.
  7. Koch DD, Oryall JJ, Quam EF, Felbruegge DH, Dowd DE, Barry PL, Westgard JO. Selection of medically useful QC procedures for individual tests on a multi-test analytical system. Clin Chem 1990;36:230-3.
  8. Westgard JO, Quam EF, Barry PL. QC selection grids for planning QC procedures. Clin Lab Sci 1990;3:271-8.
  9. Westgard JO, Hyltoft Petersen P, Wiebe DA. Laboratory process specifications for assuring quality in the U.S. National Cholesterol Education Program (NCEP). Clin Chem 1991;37:656-661.
  10. Westgard JO, Wiebe DA. Cholesterol operational process specifications for assuring the quality required by CLIA proficiency testing. Clin Chem 1991;37:1938-44.
  11. Westgard JO. Charts of operational process specifications ("OPSpecs charts") for assessing the precision, accuracy, and quality control needed to satisfy proficiency testing criteria. Clin Chem 1992;38:1226-33.
  12. Westgard JO. Analytical quality assurance through process planning and quality control. Arch Pathol Lab Med 1992;116:765-769.