Safety requires a state of "mindfulness" (Part I)
From the March 9, 2006 issue
To make real gains in patient safety, many healthcare organizations are attempting to adopt the characteristics of high-reliability organizations (HROs) that have sustained impressive safety records despite operating in unforgiving and complex environments. Examples of HROs include nuclear power plants, air traffic control systems, naval aircraft carriers, and wildland firefighting crews. As diverse as these HROs seem, they all operate within a culture of safety that can be defined by commonalities, many of which were described in our July 14 (www.ismp.org/Newsletters/acutecare/articles/20050714.asp) and July 28 (www.ismp.org/Newsletters/acutecare/articles/20050728.asp) newsletters.
One characteristic of HROs that has received less attention than deserved involves a vital set of cognitive processes used to continuously discover and correct errors, especially under adverse conditions. Collectively termed mindfulness, this defining characteristic is driven by a deep, chronic sense of unease in HROs that arises from admitting the possibility of failure even with familiar, well-designed, stable processes. (1,2) People in HROs expect surprises and consider them a valuable resource because they encourage learning and discovery and discourage complacency or inertia.(1) Workers are empowered to act on surprises to achieve reliable outcomes.
This state of mindfulness about safety results from five cognitive processes that capture the essence of HROs:
- Preoccupation with failure
- Reluctance to simplify interpretations
- Sensitivity to operations
- Commitment to resilience
- Deference to expertise (1,3,4)
Here in Part I, the first two elements of mindfulness are explained. In our next newsletter, the remaining elements will be explained, along with the differences between reliability and repeatability.
Preoccupation with failure. A chronic worry about system failure is a distinctive attribute of HROs.(1-4) People in HROs are naturally suspicious of "quiet periods" and reluctant to engage in any activities that are not sensitive to the possibility of error.(1) They ask, "What happens when the system fails?" not "What happens if the system fails?"(4) This preoccupation with failure is a striking phenomenon, because actual failures in HROs are rare. With little data about actual failures, learning in HROs is accomplished in three ways: by treating any and all failures or near misses as symptoms of system-wide problems, by thoroughly analyzing actual and potential failures, and by counteracting the complacency of success.(1,3)
To increase the flow of information available for learning, HROs encourage and reward error and near miss reporting. They clearly recognize that the value of remaining fully informed about safety is far greater than any perceived benefit from disciplinary actions. Landau and Chisholm (1,5) emphasized this point almost two decades ago when describing a seaman on a Navy nuclear aircraft carrier who broke a vital rule; he did not keep track of all his tools while working on the landing deck. He subsequently found one of his tools missing and immediately reported it. All aircraft en route to the carrier were redirected to land bases until the tool was found. The next day, the seaman was commended for his disclosure during a formal ceremony, a very different response than one might expect if, for example, a worker reported a lost sponge after an operative procedure, delaying or postponing other scheduled procedures.
HROs perform rich analyses of the information they get. They pay close attention to near misses and can clearly see how close they came to a full-blown disaster; less safe organizations consider close calls to be evidence of their ability to avoid disaster.(1) Less safe organizations also tend to localize failures (e.g., the problem is in the ICU, so changes are needed in the ICU). HROs generalize even small failures and treat them as a lens to uncover weaknesses in other vulnerable parts of the system.(1,3) HROs also acknowledge that the accumulation of small failures increases the risk of large failures.
Because HROs focus on failures, they avoid many of the dysfunctional temptations that arise from success, such as complacency, overconfidence, and inertia.(3) They do not expect success to breed success, and managers do not attribute success to their own abilities or to the organization as a whole. Instead, they are wary of the potential to drift into rote routines during periods of success. Less safe organizations might call this efficiency, but HROs consider this drift a failure because continuous adjustments to changing conditions might not occur.(1,3) While focusing on success might not appear to be a problem, preoccupation with success encourages largely mindless behavior, such as rote work routines and overconfidence.(1-4)
Reluctance to simplify interpretations. Organizations typically handle complex issues by simplifying them, thus ignoring certain aspects. HROs, however, attempt to suppress simplification because it limits the ability to envision all possible undesirable effects as well as the precautions necessary to avoid them.(1,3,4) They take nothing for granted; otherwise, every seemingly inconsequential detail ignored can accumulate and come rushing to the forefront as a complex problem.(3) By contrast, HROs pay attention to detail and actively seek to know what they don't know.(1) They do not concentrate on things that seem certain, factual, explicit, and agreeable to all. Instead, they attempt to uncover things that might disconfirm their hunches and are unpleasant, uncertain, implicit, and disputed. Workers are socialized to notice more and to strip away stereotypes that conceal differences hidden in the details.
HROs also resist simplification by seeking out different points of view, because differences, not commonalities, hold the key to detecting potential failures.(1-4) Diversity also takes the form of checks and balances, from hiring new employees with varied prior experience to novel redundancies. Most often, redundancies involve duplication of work, but in HROs, redundancies also take the form of healthy skepticism driven by wariness about claimed competencies, and a respectful mindfulness about safety.(1) Skepticism is also deemed necessary to counteract the complacency that many typical redundant systems foster.
Diversity has a potential downside: miscommunication and conflicts among workers with differing views. However, HROs are distinguished not only by their resistance to simplification through diverse viewpoints, but also by the way they manage workers with differing views.(1,3) While diverse groups will have more information upon which to base decisions, HROs understand that failed communication and mistrust can lead to withheld information. Thus, HROs place a high value on interpersonal skills, mutual respect, norms that curb arrogance and self-importance, continual negotiation, teamwork, cultivation of credibility, and deference to expertise.(1,2,4) HROs also promote trust among diverse groups by fostering the belief that humans are fallible, and that skepticism and diversity are necessary to improve reliability.(1)
1) Weick KE, Sutcliffe KM, Obstfeld D. Organizing for high reliability: processes of collective mindfulness. Research in Organizational Behavior 1999; 21:81-123.
2) Reason J. Managing the risks of organizational accidents. Burlington, VT: Ashgate Publishing Company; 2000.
3) Weick KE, Sutcliffe KM. Managing the unexpected: assuring high performance in an age of complexity. San Francisco: Jossey-Bass; 2001.
4) Leonard M, Frankel A, Simmonds T, Vega K. Achieving safe and reliable healthcare: strategies and solutions. Chicago: Health Administration Press; 2004.
5) Landau M, Chisholm D. The arrogance of optimism: notes on failure avoidance management. Journal of Contingencies and Crisis Management 1995; 3:67-80.