Our long journey towards a safety-minded Just Culture
Part I: Where we've been
From the September 7, 2006 issue
Punitive culture. Before the 1990s, healthcare providers often attempted to manage risk and errors by making frequent exhortations to work carefully and by retraining, counseling, or disciplining workers involved in errors, particularly those closest to the event. The prevailing thought at the time was that individual workers were fully, and sometimes solely, accountable for the outcomes of patients under their care, even if the underlying processes for achieving those outcomes were not under their direct control.
Perfect performance was expected and felt to be achievable through education, professionalism, vigilance, and care. The threat of disciplinary action for errors was thought to be necessary to maintain proper safety vigilance. Counseling sessions after an error often focused on perceived weaknesses in the individual worker, with little thought to the system's contribution. Improvement strategies offered to the worker were often goal-oriented—"follow the 5 rights," for example, when a medication error occurred—with little direction about how to achieve the goals or how to make safer behavioral choices.
In many cases, the severity of disciplinary action was determined by the severity of the undesirable outcome. Some believed that the consequences of even a single mistake were enough to justify punitive sanctions. Thus, workers who made an error that caused patient harm were often felt to be "justly" disciplined. Procedural violations were simply regarded as unacceptable, which offered little insight into their system-based causes. Many believed that "bad practitioners" were the cause of frequent or harmful errors, and that weeding out these individuals would result in a safer healthcare environment.
In truth, such a punitive environment produced the exact opposite of the intended result. Fear of retribution, ranging from undue embarrassment to termination of employment and/or licensure, drove errors underground. Frontline workers were afraid to report their own errors or those of a colleague. Few even considered reporting near misses or hazards that could lead to errors, believing little would be done to avoid the inevitable. Instead, when they noticed these minefields at all, they created work-arounds in an attempt to avoid them. Middle managers and leaders grew content with believing that "no news" was "good news," thus missing enormous opportunities to learn about risks and implement robust system changes to reduce the chance of error.
Blame-free culture. By the mid 1990s, a culture shift that supported a "blame-free" or "no-blame" response to errors purportedly flourished in many healthcare organizations in response to the shortcomings of a punitive culture. Compared to the culture it sought to replace, it was clearly a step in the right direction.(1) It acknowledged human fallibility and the impossible task of perfect performance. It recognized that most unsafe acts were the result of mental slips or lapses, or honest mistakes that were rooted in system, process, technical, or environmental weaknesses that lay dormant in the organization until errors or proactive assessment efforts brought them to light.
In a "no-blame" culture, there was general agreement that even the most experienced, knowledgeable, vigilant, and caring workers could make mistakes that could lead to patient harm. There was recognition that workers who made honest errors were not truly blameworthy, nor was there much benefit to punishing them for these unintentional acts. Nevertheless, a 2001 ISMP culture survey to which more than 1,200 healthcare professionals responded made two things abundantly clear: individual attitudes about errors, disciplinary action, and accountability for safety were beginning to move away from being overly punitive, but they did not fully support an industry-wide desire to become wholly blame-free. (For details, see our August 22, Sept. 5, and Sept. 19, 2001 newsletters.)
The problem is, the "blame-free" concept has a weakness—it fails to confront individuals who willfully (and often repeatedly) make unsafe behavioral choices, knowingly disregarding a substantial and unjustifiable risk that most peers would recognize as being likely to lead to a bad outcome.(1-3) While disciplining for honest mistakes is counterproductive, the failure to discipline workers involved in mishaps accompanied by truly reckless behavioral choices that endanger patients presents a valid objection to a wholly blame-free culture.(1-4) In these cases, sanctions of an appropriate severity may be warranted. Amnesty for all unsafe acts also lacks credibility and opposes many workers' sense of justice. Thus, a wholly blame-free culture is neither feasible nor desirable.(1-4)
Just Culture. A new, more just culture is emerging in healthcare that addresses the weakness in a wholly blame-free approach and also runs counter to an overly punitive culture. One of the leading authorities on the topic, David Marx, describes it this way:
"On one side of the coin, it is about creating a reporting environment where staff can raise their hand when they have seen a risk or made a mistake. It is a culture that rewards reporting and puts a high value on open communication—where risks are openly discussed between managers and staff. It is a culture hungry for knowledge.
On the other side of the coin, it is about having a well-established system of accountability. A Just Culture must recognize that while we as humans are fallible, we do generally have control of our behavioral choices, whether we are an executive, a manager, or a staff member. Just Culture flourishes in an organization that understands the concept of shared accountability—that good system design and good behavioral choices of staff together produce good results. It has to be both."(5)
In Part II: Where we are going, which will appear in our next newsletter, we will further describe a Just Culture, its vast importance to patient safety, and its growing potential to finally drive a cultural paradigm shift in various healthcare arenas, including health systems, state departments of health, and professional licensing boards.
References.
1) GAIN Working Group E, Flight Ops/ATC Ops Safety Information Sharing. A roadmap to a just culture: enhancing the safety of the environment. September 2004 (http://220.127.116.11/products/documents/roadmap%20to%20a%20just%20culture.pdf).
2) Outcome Engineering. The just culture algorithm—version 1.0. Dallas, TX: Outcome Engineering, Inc.; March 2005 (www.justculture.org/downloads/jc_algorithm05.pdf).
3) Outcome Engineering. An introduction to just culture. Dallas, TX: Outcome Engineering, Inc.; 2005 (www.justculture.org/downloads/jc_overview.pdf).
4) Reason J. Managing the risks of organizational accidents. Hants, England: Ashgate Publishing Limited; 1997.
5) Marx D, Comden SC, Sexhus Z. Our inaugural issue—in recognition of a growing community. The Just Culture Community News and Views. Nov/Dec 2005;1:1.