Tools, Technologies and Training for Healthcare Laboratories

Safety culture in the lab

You probably already know that Patient Safety is an important "new" concept in healthcare. It's a hot topic in articles and at conferences, and new institutions have been created and new laws passed just to promote Patient Safety. So where does the lab fit in?

Patient Safety is a no-brainer. Whatever you may feel about Six Sigma, Lean, ISO, CLIA, CAP, JCAHO, and/or EQC, you can hardly object to Patient Safety. Patient Safety is as natural to healthcare as “First, Do No Harm.” Is there anyone out there who argues against Patient Safety? (The American Association of Increasing Patient Risk?) At last, everyone in healthcare can agree on a concept. Or at least, we can all agree on the idea that we should do no harm to the patient.

But Patient Safety isn't just a generic concept. The term has a specific definition, and denotes a distinct philosophy of management and organization, as well as a set of tools, processes, and prescriptions. The Patient Safety concept has a decades-long history outside of healthcare, where it cut its teeth on the challenges of aviation and nuclear power plant safety (to name just two). Outside of healthcare, this school of thought is often known simply as Safety Culture.

Safety Culture has come to healthcare as Patient Safety. And it has grown into one of the pillars of today's healthcare. Patient Safety articles regularly appear in the journals and the industry press [See “Five Years After 'To Err is Human': How can labs improve Patient Safety?” by Kay Downer, Clin Lab News, February 2005; Michael Astion, MD, PhD's AACC Expert Access on Patient Safety (March 2005): http://www.aacc.org/access/safety/index.asp; or the articles in the October 2005 Archives of Pathology and Lab Medicine: http://arpa.allenpress.com/arpaonline/?request=get-toc&issn=1543-2165&volume=129&issue=10 Both URLs accessed 10/20/05].

JCAHO's annual Patient Safety Goals continue to grow and expand, addressing gross failures and promoting best practices. In fact, JCAHO has proposed adding Patient Safety Leadership Goals in 2007. These goals provide a useful, concise definition of Patient Safety:

“In a culture of safety and quality, all individuals are focused on continuous excellence in performance; accept safety as a personal responsibility; and work together to minimize any harm that might result from unsafe or poor quality of care, treatment, or services. Leaders create this culture by demonstrating their commitment to safety and quality and by taking actions to achieve the desired state. In this culture, one finds teamwork, open discussions of concerns about safety, and the encouragement of and reward for internal and external reporting of safety issues. Although reckless behavior and a blatant disregard for safety are not tolerated, the focus of attention is on the performance of systems and processes instead of the individual. Organizations are committed to ongoing learning and have the flexibility to adapt to and accommodate changes in technology, science, and the environment.” [Proposed JCAHO Overview of 2007 Laboratory Services Leadership Chapter, http://www.jcaho.org/accredited+organizations/laboratory+services/standards/field+reviews/07_lab_ld_stds.pdf accessed 10/20/05]

Increasingly, JCAHO views hospital and laboratory performance through the lens of Patient Safety. And just a few months ago, Patient Safety passed a major milestone. It became the law of the land.

Patient Safety: It's the Law

On July 28, the Patient Safety and Quality Improvement Act of 2005 was signed into U.S. law. The main effect of this legislation is to encourage the voluntary reporting of medical errors, adverse events, and other healthcare mistakes, to be collected in national patient safety databases. The law also authorizes the creation of patient safety organizations, which will collect, maintain, and analyze the patient safety databases. The government will then prepare annual reports and recommendations based on the findings of those databases.

The goal of the patient safety database is to learn lessons faster and spread knowledge further. By determining how and why a catastrophe happened in one hospital, and quickly alerting other hospitals of the danger, we can prevent repeated medical mistakes. Furthermore, by encouraging confidential reports of medical errors and open discussion of those errors, a foundation can be laid for a culture of patient safety.

Before this law, there was little incentive to report errors. Reporting an error, perhaps one that no one yet knew about, exposed the individuals and the institution involved to legal action, as well as public and professional sanction. Now, when errors are reported to these patient safety databases, they will be exempt from legal action, and those who report them will be protected from retaliation.

Lab Culture: “Is it Safe?”

Every organization has a culture. An organizational culture is the set of beliefs, attitudes, and priorities of the members and leaders of that organization. When someone new to the hospital or lab asks why you do something in a certain way, the culture is what guides your answer (“Because that's how we do things around here.”). Often multiple cultures coexist within a single organization (opposing values in different departments, differences of thought between management and bench workers, etc.). From the perspective of safety culture, there are good cultural values and bad cultural values (i.e., there is no “I'm okay, you're okay” in organizational culture).

In the literature of Safety Culture, there is also a taxonomy of cultures, ranging from pathological to generative. Looking at the descriptions of these cultures can help us decide where our laboratories fit in the spectrum of culture.

Pathological: “We don't want to know.”

This culture is driven mainly by economic constraints, production pressures, and business concerns; safety is ignored, suppressed, or actively circumvented. Even under normal circumstances, the safety procedures are inadequate. When failures are discovered, the messenger is shot, individual workers are blamed, and the problems are often concealed. Overall, the culture shirks responsibility for safety, and new ideas and solutions to safety problems are discouraged.

Calculative: “We know what we know, but we don't know what we don't know.”

Safety here is adequate under normal operating conditions (perhaps compliant is a term that applies here). Safety practices are implemented where mandated. Members accept responsibility for their part of the safety system. When someone discovers an error, the organization listens to the messenger, and attempts local repairs.

The problem with this culture is that it often fails in unforeseen circumstances. Sometimes the messengers do not arrive, or the disaster plans are not effective. In general, the fixes made after a failure tend to address a local detail, a single procedure or personnel issue, not the system. Members are compartmentalized and may not have the ability to implement a solution that crosses boundaries. When broad solutions to safety issues are suggested, they often pose problems due to the effort required to implement them.

Generative: “We know that we don't know everything, but we're trying to learn.”

This culture enshrines safety as a core principle of the organization, and has active participation at all levels. A great amount of energy is spent furthering safety goals, and targets for safety are set beyond ordinary expectations. Members are willing to do unexpected things in unexpected ways, value the results more than procedural forms, and even at the lowest level, are empowered to “see” problems and do something about them. All members are trained to be messengers, to seek out and find problems, and are rewarded for finding problems and proposing new ideas. Because of this high dedication and motivation, problems are discovered at earlier stages, before disaster strikes.

Another characterization of this culture is that it is in a state of “chronic unease,” actively seeking out bad news, wary of complacency, and never content to accept the status quo.

[These descriptions come from:
Westrum R. Organisational and inter-organisational thought. World Bank Workshop on Safety Control & Risk Management, Washington DC, 17-18 October 1988, as quoted in James Reason, Human Error, Cambridge University Press, Cambridge, United Kingdom, 1990;
and
Westrum R. “Cultures with requisite imagination” in J. Wise, D. Hopkinson, P. Stager (eds.), Verification and Validation of Complex Systems: Human Factors Issues (Berlin: Springer-Verlag, 1992), pp. 401-416, as quoted in James Reason, Managing the Risks of Organizational Accidents, Ashgate Publishing Limited, Hants, United Kingdom, 1997.]

It's obvious that the Pathological organization is bad, the Calculative organization is at best okay, and the Generative organization is the ideal. It's also important to point out that this simplification assumes a single state across the entire organization. As noted before, multiple cultures can exist within a single organization (I suspect the typical self-assessment might describe management as pathological, the laboratory as calculative, and the individual rating himself or herself as, of course, generative).

The main goal of these culture models is to identify bad practices and contrast them with ideal practices. Wherever pathological values exist, they should be replaced with calculative and generative values.

What's more interesting is to set these values in a real-world context. For instance, what was the culture of Maryland General? Clearly, pathological values ran amok there. And yet, the whistle-blower displayed all the ideals of the generative culture (and was accordingly “shot” for it) in her efforts to expose, stop, and fix the problems.

Now let's bring the Maryland General situation into your laboratory: what actions did you take when you learned about Maryland General? Did you ignore what was going on there? Did you figure that if you were in compliance, you were fine? Or did you review your own lab to see if any similar practices were occurring?

Here's another, less charged, question: what are your criteria for purchasing new instruments and methods? Are quality specifications developed before purchasing? Are method performance characteristics determined before purchase, or are they determined afterward by the manufacturer's techs? Do speed and price overrule quality issues when purchasing an instrument?

An even better question is, how would your laboratory behave if a bench-level technologist stepped up and said, “This method/instrument isn't safe. It is not providing the quality needed by the patient.” How would your laboratory respond? Are the words of your techs ignored? Are they heard, but given only partial responses? Would you ask the manufacturer for a replacement, but give up if they refused to provide one for free? Has a bench-level technologist ever been rewarded for bringing an error to the administration's attention?

Now you know your culture.

Where does Patient Safety fit in the Quality Universe?

Now that we've been introduced to Patient Safety, let's place it in the laboratory context. In our universe of regulations, practices, certifications, systems, movements and fads, we locate Patient Safety near the very origin. Patient Safety is a “Big Bang” event, one so large that it encompasses such diverse galaxies as ISO, Lean, and Six Sigma, which are “merely” methods of achieving the goal. Statistical quality control, or QC in general, is a vital star in the Patient Safety universe, although surely not the only one.

When we boil this down, Patient Safety has always been with us. Quality in healthcare means patient safety. Patient safety in healthcare means quality. They are indivisible. Patient Safety has always been implicit in our thinking, but the literature, tools, and approaches of Patient Safety make it explicit and emphatic. It's a new path to a destination we've long been seeking.

The rise of Patient Safety allows us to admit an important failure of language. The weight and meaning of “Quality” have been worn out after decades of overuse and outright misuse. You can expect that every label, committee, conference, and organization that once had “Quality” in the title will soon feature “Patient Safety” instead. Let us hope that the new terminology helps us revive the pursuit of excellence in healthcare.