Featured Articles

Our Long Journey Towards a Safety-Minded Just Culture, Part II: Where We're Going

The culture in healthcare is evolving into one that is neither wholly punitive nor wholly blame-free when errors happen. As noted in Part I: Where We've Been, the pendulum has swung from a “name, shame and blame” philosophy to an “amnesty for all” policy, and is now settling at a midpoint—a Just Culture—that is both fair to workers who make errors and effective in reducing safety risks.

In a Just Culture, all workers know that safety is valued in the organization, and they continually look for risks that pose a threat. They are thoughtful about their behavioral choices and always thinking about the most reliable ways to get the job done right. Managers are constantly looking for system design features that would give the workforce the best opportunity to perform well. While it is recognized that every endeavor carries the risk of human error, workers are held accountable for the things that are under their control: system design, particularly for the management and administrative team, and behavioral choices for the entire workforce.1

Three types of behavior can be involved in error: human error, at-risk behavior, and reckless behavior.2,3 Each type of behavior has a different cause, so a different response is required.

Human error. Human error involves unintentional and unpredictable behavior that causes or could have caused an undesirable outcome, either because a planned action is not completed as intended or the wrong plan is used to achieve an aim.1,4 Since most human errors arise from weaknesses in the system, they must be managed through process, system, or environmental changes. Discipline is not warranted or productive, because the worker did not intend the action or the risk or harm that resulted. The only just option is to console the worker and shore up the systems to prevent further errors.2 As Reason4 notes, we cannot change the human condition, but we can change the conditions under which humans work.

At-risk behavior. Everyone knows that “to err is human,” but we tend to forget that “to drift is human,” too.3 Behavioral research shows that we are programmed to drift into unsafe habits, to lose perception of the risk attached to everyday behaviors, or to mistakenly believe the risk is justified.5 In general, workers are most concerned with the immediate and certain consequences of their behavior—saved time, for example—and undervalue delayed or uncertain consequences, such as patient harm. Their decisions about what is important on a daily list of tasks are based on the immediate desired outcomes. Over time, as perceptions of risk fade away and workers try to do more with less, they take shortcuts and drift away from behaviors they know are safer.3

The reasons workers drift into unsafe behaviors are often rooted in the system. Safe behavioral choices may draw criticism, while at-risk behaviors may be rewarded. For example, a nurse who takes longer to administer medications may be criticized, even if the additional time is attributable to safe practice habits and patient education. But a nurse who is able to handle a half-dozen new admissions in the course of a shift may be admired, and others may follow her example, even if dangerous shortcuts have been taken. Therein lies the problem. The rewards of at-risk behaviors can become so common that perception of the risk fades or the risk is believed to be justified.4

The incentives for unsafe behaviors should be uncovered and removed, and stronger incentives for safe behaviors should be created.3 The solution is not to punish those who engage in at-risk behaviors, but to uncover the system-based reasons for their behavior and decrease staff tolerance for risk taking. Once the incentives for their at-risk behaviors have been addressed, workers should be coached on making better behavioral choices. (Additional tips for reducing at-risk behaviors can be found in our September 23 and October 7, 2004 issues.)  

Reckless behavior. Workers who behave recklessly 

  • always perceive the risk they are taking,
  • understand that the risk is substantial,
  • behave intentionally, but are unable to justify the behavior through objective risk–benefit analysis (i.e., they do not mistakenly believe the risk is justified),
  • know that others are not engaging in the same behavior (i.e., it is not the norm), and
  • make a conscious choice to disregard the substantial and unjustifiable risk, for subjective reasons that do not meet the usual grounds of social utility.2

The difference between at-risk behavior and reckless behavior can be likened to speeding on the highway. Most people drive 5 to 10 miles per hour over the speed limit on a limited-access highway; this is an at-risk behavior that has become the norm. Most drivers feel safe at this speed and can justify the behavior on the basis of the social utility of getting to a destination more quickly. Most drivers would agree, however, that driving 95 miles per hour where the speed limit is 65 is reckless behavior. The risk is substantial and known, and the social utility of arriving at a destination more quickly no longer justifies the behavior when weighed against the risk of an accident.

Healthcare providers very rarely engage in reckless behavior. Behavior is categorized as reckless when the worker realizes the possibility of harm, whether or not harm was intended or results.3 Workers under the influence of alcohol or illegal drugs may know that harm is possible but not intend it. In other cases, such as patient homicide, harm is intended. A pharmacist who is too vain to wear reading glasses while entering medication orders also may be exhibiting reckless behavior if she knows the risk she is taking, knows that the risk is substantial, and then consciously disregards it; most workers would not consider vanity a justifiable reason for taking that risk. Reckless behavior is blameworthy behavior. As such, it should be managed through remedial or disciplinary actions according to the organization’s human resources policies.3

A promising road ahead. A Just Culture includes not only a robust accountability model that is fair to all stakeholders, but also a model for addressing system and behavioral risks both before and after events occur.6 Managers coach workers on the risks posed by behaviors they observe daily. They seek to eliminate the incentives for at-risk behaviors before patients are harmed. They engage the workforce in uncovering and repairing system design flaws before human errors occur. And when an error occurs, the most important question is not necessarily how to handle the involved workers but what can be done to avoid the next error.7

External influences can make it difficult for healthcare systems to maintain a Just Culture. When a patient is harmed, licensing bodies, the legal system, the patient’s family, and the media tend to deal with the involved workers in a punitive manner. However, some licensing and regulatory bodies have joined statewide initiatives supporting a Just Culture. For example, the Minnesota Hospital Association, Minnesota Alliance for Patient Safety, state department of health, and state licensing boards for medicine, pharmacy, and nursing have joined with hospitals to create and uphold a Just Culture.8 The idea has spread to other states, including North Carolina, Connecticut, Hawaii, and Kentucky. These collaboratives have grown into a Just Culture community where regulators and organizations from various industries, including healthcare, come together to learn, share experiences, and use a common set of tools. Such efforts hold promise for broader adoption of a Just Culture. Although the road will not be easy, it is an all-important step; so much else in patient safety depends upon it.

References:

  1. Marx D. Patient safety and the “just culture”: a primer for health care executives. April 17, 2001. Prepared for Columbia University under a grant provided by the National Heart, Lung, and Blood Institute.
  2. Outcome Engineering. The just culture algorithm, version 1.0. Dallas, Tex: Outcome Engineering, Inc; March 2005.
  3. Outcome Engineering. An introduction to just culture. Dallas, Tex: Outcome Engineering, Inc; 2005.
  4. Reason J. Managing the Risks of Organizational Accidents. Hants, England: Ashgate Publishing Limited; 1997.
  5. Marx D, Comden SC, Sexhus Z, eds. Repetitive at-risk behavior—what to do when everyone is doing it. Just Culture Community News and Views. 2005;1(Nov/Dec):5–6.
  6. Marx D, Comden SC, Sexhus Z, eds. Our inaugural issue—in recognition of a growing community. Just Culture Community News and Views. 2005;1(Nov/Dec):1.
  7. Marx D, Comden SC, Sexhus Z, eds. Using the JC algorithm: a practice session. Just Culture Community News and Views. 2006;2(Jan/Feb):4–6.
  8. Marx D, Comden SC, Sexhus Z, eds. The Minnesota journey—an interview with Alison Page, MS, MHA, VP of patient safety, Fairview Health Services. Just Culture Community News and Views. 2005;1(Nov/Dec):2–4.
  9. GAIN Working Group E, Flight Ops/ATC Ops Safety Information Sharing. A roadmap to a just culture: enhancing the safety of the environment. September 2004.