Cost-Effective QC for Chemistry and Coagulation
Guest authors use QC Design software to customize their QC procedures for their laboratories.
Phoenixville Hospital, Phoenixville, PA
Washington Hospital Center, Washington, DC.
- Introduction: Questions about QC Validator and cost-effective QC
- The Answers
- Table 1: Data from Phoenixville Hospital
- Computer simulation
- How gaussian is the algorithm?
- Table 3: Data from Washington
We have been working with the QC Validator computer software in an effort to design more cost-effective and simpler QC systems for our laboratories. During our work with the program we asked three questions:
- Will Validator help us design cost-effective QC programs in chemistry and coagulation?
- Once the new mean is found, is it always necessary to return to the software and repeat the QC design process?
- When changing lots of reagents, how much data must be collected to establish a new mean?
QC Validator has helped us reduce the number of reruns, cutting expenses and turnaround time, when we use the 13s, the 12.5s, or a multirule (13s/22s/R4s).
One of these control rules works for more than 75% of the chemistry and coagulation tests. Using retrospective QAP data for our chemistry and coagulation tests, we have found that while the mean sometimes changes with new reagents and calibration, the SD remains constant. Thus the only variable in the calculation of the critical systematic error (SEc) is the mean.
These same QAP data were then used to find the range of means over several months of control lots. We found that, even when the mean shifted, the same QC rules for 2 controls per run could generally be used if the largest bias observed over a year was used together with the usual SD. In essence this allows us to establish allowable limits for the mean and SD that permit us to keep the same rules in place and return to QC Validator only when those limits are exceeded.
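To make the dependence on bias concrete, the critical systematic error can be computed with the conventional Westgard formula, SEc = (TEa − |bias|)/SD − 1.65, where TEa is the total allowable error. A minimal sketch (the TEa, bias, and SD values below are hypothetical, not our laboratory's data):

```python
def critical_se(tea, bias, sd):
    """Critical systematic error (Westgard formula).

    All arguments must be in the same units (e.g. percent of the mean):
    tea  - total allowable error
    bias - observed bias (the largest yearly bias, in our approach)
    sd   - the usual (stable) SD
    """
    return (tea - abs(bias)) / sd - 1.65

# Hypothetical example: TEa = 10%, largest yearly bias = 2%, SD = 2%
print(critical_se(10.0, 2.0, 2.0))  # ≈ 2.35
```

Because the SD stays constant, re-evaluating this one number with the worst-case bias tells us whether the rules already in place still provide adequate error detection.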
Thus, QC Validator has helped us reduce the number of repeats and reruns, and has cut turnaround time and costs. Working with Validator has enabled us to reduce the cost of switching reagents or lots and to simplify the QC design by using fixed limits for our means and SDs.
Using our own data as well as computer simulation, we have found that given CVs up to 00%, the mean changes little after eight replicates. Thus we have chosen to collect 8 data points for each of two levels of control, analyzing them four times at the beginning of a run and again four times at the end of the run in which the new calibration or reagent lot is begun. This allows us to quickly establish the new mean and begin monitoring the system.
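The stabilization of the mean after eight replicates can be illustrated with a simple simulation (a sketch only, not the QC Validator algorithm; the true mean and CV used below are hypothetical):

```python
import random

def mean_estimation_error(true_mean, cv_percent, n=8, trials=1000, seed=42):
    """Average absolute % deviation of an n-replicate mean from the
    true mean, for Gaussian data with the given CV."""
    random.seed(seed)
    sd = true_mean * cv_percent / 100.0
    total_dev = 0.0
    for _ in range(trials):
        m = sum(random.gauss(true_mean, sd) for _ in range(n)) / n
        total_dev += abs(m - true_mean) / true_mean * 100.0
    return total_dev / trials

# Hypothetical control: mean 100, CV 5% -> the 8-point mean typically
# sits well within the analytical SD of the true mean.
print(mean_estimation_error(100.0, 5.0))
```

Doubling n beyond 8 shrinks this error only by a factor of roughly 1/√2, which is why eight replicates are a practical stopping point.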
At Phoenixville Hospital we used two lots (normal and abnormal) of commercial control (Biorad) for the chemistry study. The analyses were performed on a Dade Dimension.
At Washington Hospital Center we used two lots (normal and abnormal) of commercial control (Dade) for the chemistry study. The analyses were performed on a Beckman CX3 or CX7.
The urine control was from Biorad. The immunochemistry tests were performed on an Abbott TDX using Biorad controls. For the protimes and PTTs we used Dade Level I and III. The analyses were performed on an MLA 1000 using Thromboplastin C Plus for PT and Actin FSL for PTT. The computer simulations were done using an algorithm for "Gaussian" data.
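The original does not say which Gaussian algorithm was used for the simulations; the Box-Muller transform is one common choice, and a sketch of it might look like this:

```python
import math
import random

def gauss_box_muller(mu=0.0, sigma=1.0):
    """Draw one Gaussian deviate via the Box-Muller transform
    (one common algorithm for 'Gaussian' data; whether the original
    study used this particular method is not stated)."""
    u1 = random.random()
    while u1 == 0.0:          # avoid log(0)
        u1 = random.random()
    u2 = random.random()
    z = math.sqrt(-2.0 * math.log(u1)) * math.cos(2.0 * math.pi * u2)
    return mu + sigma * z

# Simulated control data: mean 100, SD 5 (hypothetical values)
random.seed(0)
samples = [gauss_box_muller(100.0, 5.0) for _ in range(10000)]
```

The sample mean and SD of such simulated data converge on the chosen parameters, which is what makes the replicate-count experiments above possible.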
We examined chemistry and coagulation data from Phoenixville Hospital for a 10 month period to study how best to use QC Validator. The data indicate that most tests can use the 13s or 12.5s single rule. For the tests where a single rule is not appropriate, a multirule (13s/22s/R4s/41s or 13s/22s/R4s/10x) is adequate in most cases. Over the 10 month period the method was stable and no failures were noted.
We also compared the SDI with the critical systematic error (SEc) and found 6 of 44 situations where the SDI was less than 1.0 while the SEc was less than 2.5. This suggests that it is necessary to use more than the SDI to monitor an analytical system.
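The two indices answer different questions: the SDI measures how far the laboratory mean sits from the peer-group mean in group-SD units, while the SEc measures how much room the method has for error detection. A sketch with hypothetical numbers showing how a test can pass one check and fail the other:

```python
def sdi(lab_mean, group_mean, group_sd):
    """Standard deviation index: lab bias in peer-group SD units."""
    return (lab_mean - group_mean) / group_sd

def critical_se(tea, bias, sd):
    """Critical systematic error (Westgard formula), same units throughout."""
    return (tea - abs(bias)) / sd - 1.65

# Hypothetical test: peer mean 100, peer SD 3, lab mean 102;
# TEa = 8%, bias = 2%, CV = 3%.
print(sdi(102.0, 100.0, 3.0))      # ≈ 0.67 -> peer comparison looks fine
print(critical_se(8.0, 2.0, 3.0))  # ≈ 0.35 -> little margin for error detection
```

Here the SDI (< 1.0) alone would suggest acceptable performance, yet the low SEc (< 2.5) shows the QC rules have little power to catch a medically important shift, which is the pattern seen in 6 of our 44 situations.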