In 1996, Diane Vaughan coined the term "Normalization of Deviance" to describe how NASA and its related industries rationalized their way into the decision to launch the Space Shuttle Challenger in 1986. Decades later, this safety concept still applies - and is very germane to many of our current QC practices in the laboratory.
The July 2011 Clinical Laboratory News had an excellent article, "The Slippery Slope of Errors," which discussed Diane Vaughan's 1996 book, The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. We covered this topic back in 2003 on the website and in 2004 in the Nothing but the Truth about Quality manual. It's an important subject, and one worth re-emphasizing.
A more recent book was published in 2011 by Sidney Dekker, an expert in patient safety, human error, and organizational resilience. This new book, Drift into Failure, gives a very succinct definition of the Normalization of Deviance:
"signals of potential danger... are acknowledged and then rationalized and normalized, leading to a continued use under apparently similar circumstances. This repeats itself until something goes wrong, revealing the gap between how risk was believed to be under control and its actual presence in the operation." [Sidney Dekker, Drift into Failure, page 106]
Basically, the production pressures of an organization can force an incremental drift away from safety. Each step is small, and during "good" (or lucky) times, operations can continue without bad consequences. But each step away from safety leads to greater danger, until finally an error occurs, and the organization finds itself far more vulnerable, more severely impacted, and less able to recover.
The important distinction is that this increasing vulnerability doesn't occur because of bad acts (neither the workers nor the managers intend to make things worse), nor does it occur as a truly rational choice. The organization doesn't make a specific announcement, "We're now going to trade safety in exchange for more efficiency." Instead, a slow creep occurs, where pressures to get things done motivate people to engage in workarounds, improvisations, and other acts to "get it done."
In the new book, Dr. Dekker provides a useful step-by-step account of how this Normalization of Deviance takes place.
Perhaps the pattern already seems familiar. If not, let's put it into the QC context.
In this way, laboratories literally normalize deviance: deviations come to be accepted as normal.
As we've said before (and before, and before), the use of 2 SD control limits is a MAJOR PROBLEM in laboratories. This tradition (and it is a tradition, not a practice based on science, just a habit handed down from one generation of laboratory workers to the next) generates a lot of false rejections, causes a lot of bad responses, and can corrupt and corrode the quality system of the laboratory. The "Cry Wolf" effect can kick in, for instance: we become so accustomed to the outliers that we stop listening to them (and assume that every violation is a false rejection). Or we artificially widen our limits until they are so wide we can't detect errors anymore.
[To see the extent and persistence of this problem, check out the article on The QC we really do from earlier this year.]
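A quick calculation shows why 2 SD limits cry wolf so often. Here is a minimal sketch in Python; it assumes Gaussian-distributed, independent control results, and the numbers of controls per run are purely illustrative:

```python
import math

# Two-sided tail probability beyond +/- 2 SD for a Gaussian distribution (~4.55%)
p_single = math.erfc(2 / math.sqrt(2))

# Probability that at least one of n independent, in-control results
# falls outside the 2 SD limits - i.e., a false rejection
for n in (1, 2, 4):
    p_false_reject = 1 - (1 - p_single) ** n
    print(f"{n} control(s) per run: ~{p_false_reject:.1%} false-rejection rate")
```

Even with only two controls per run, roughly one run in eleven gets flagged when nothing is actually wrong - fertile ground for learning to ignore the alarm.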
The patient safety experts and high reliability theorists don't have a lot of optimism when it comes to the Normalization of Deviance. Given the inherent production pressures in any organization, particularly resource-constrained organizations, there isn't an easy solution. Dr. Dekker notes somberly that while there ought to be a solution, it may not be achievable in the typical organization:
"The solution to risk, if any, is to ensure that the organization continually reflects critically on and challenges its own definition of 'normal' operations, and finds ways to prioritize chronic safety concerns over acute production pressures. But how is this possible when the definition of 'bad news' is something that gets constantly renegotiated, as success with improvised procedures or imperfect technology accumulates? Such past success is taken as guarantee of future safety." [Dekker, ibid. page 108]
Thankfully, in the laboratory, we have a few options in our toolbox, and we can improve just by abandoning some of our antiquated tools. If we move away from 2 SD control limits, if we adopt a QC Design (Sigma-metric) approach, and if we recognize that a "compliance" strategy is simply a race to the bottom, we can move away from deviance and set out on a path to excellence.
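For those less familiar with the QC Design approach, its core is the Sigma-metric, which compares the quality required for the test against the method's observed bias and imprecision. Here is a minimal sketch; the allowable total error, bias, and CV values below are hypothetical, purely for illustration:

```python
def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """Sigma-metric = (allowable total error - |bias|) / imprecision, all in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical assay: 10% allowable total error (TEa), 1.5% bias, 2.0% CV
print(f"Sigma-metric: {sigma_metric(10.0, 1.5, 2.0):.2f}")  # prints 4.25
```

The higher the Sigma-metric, the simpler the control rules and the fewer control measurements needed for adequate error detection; low Sigma-metrics call for more stringent multirule QC rather than wider limits.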
The important thing to realize is that this behavior is "normal" in organizations, and if you find yourself with normalized deviance, it's not your individual fault. But once we recognize we're doing the wrong QC wrong, we need to start doing things right.