From the July 28, 2005, issue
Healthcare is a complex, error-prone industry. However, other complex, error-prone industries, such as aviation and nuclear arms handling, have much better safety records than healthcare. The fundamental differences between these industries, known as high-reliability organizations (HROs), and healthcare are deeply rooted in their cultures. While an exact definition of a culture of safety is still emerging in healthcare, recurrent themes in HROs offer an opportunity to reflect on how our respective cultures may differ. Part I, published in our July 14, 2005, issue, presented the first six recurrent themes: strategic emphasis on safety, consistent message from managers, just culture, feedback loops, learning organization, and desire to change. Part II presents the remaining themes.
Teamwork. In HROs, teams comprising multiple disciplines and levels of workers meet regularly to plan, deliberate, communicate, and evaluate their work. The rigid hierarchical structures that make it difficult for people to raise concerns and voice opinions, regardless of rank or education, have been dismantled, so genuine collaboration in seeking solutions to problems can occur. The teams are highly functional; they are not only proficient in technical skills but also adept at avoiding errors caused by failures in communication and decision making in a dynamic environment.
Effective teamwork doesn’t come naturally; thus, in HROs, those expected to work in teams are trained in teams via simulation, so the education is directly linked to the desired outcome. For example, in aviation, effective team communication is taught through simulations known as crew resource management.(3) Aviation crew members spend many days together practicing communication and inquiry skills: how to ask questions, ways to seek relevant information, how to advocate for safety and counteract intimidation, how to resolve conflicts, and how to make team decisions. During the simulations, they focus on key principles of teamwork: understanding each other’s roles and responsibilities, monitoring team performance, anticipating team members’ needs, adjusting to team members’ actions, adapting to changing environments, considering alternative solutions provided by teammates, identifying mistakes and lapses in team members’ actions, and recovering to correct for these errors.(3,5)
Localized decision making. Because HROs have effective teams, they are able to shift the burden of decisions from top leaders to a more localized decision-making model. First, every team member feels empowered to make quick decisions regarding impending safety issues. At the same time, workers understand that collaboration with other team members, when time permits, results in higher quality decisions. Decisions made by teams can also lead to innovation that can’t be realized without such collaborative interaction.(2,4)
Carefully selected rules. HROs live by a minimalist approach to rules.(2,4) They offer standard operating procedures to reduce variation. However, when it comes to rules, they provide a minimum set, giving workers enough guidance on high-priority issues without obligating them to stop thinking and simply apply rules. Because HROs do not want to stifle creativity or learning, they expect workers to think critically about the work they are doing rather than apply rules mechanically.
Systems that defy error. HROs design systems that defy errors, avoiding reliance on human memory and vigilance. They carefully consider human factors, attending to unsafe conditions such as long working hours, excessive workloads, unsafe staffing ratios, sources of distraction, and other environmental conditions known to contribute to errors. HROs do not rely solely on worker education to ensure that systems function properly; instead, they employ technology and proven principles of error reduction, such as forcing functions, failsafes, standardization, simplification, and constraints, that have demonstrated sustained safety improvement over time.
Redundancies and recovery plan. One hallmark of HROs is their high level of redundancy in both safety measures and personnel for the most critical processes.(1) This means extra trained staff and equipment on site to duplicate critical functions and, thus, detect errors and intercept them before harm occurs. Assuming errors will happen, HROs also plan for recovery after an error. This means making errors visible and easy to reverse, or making it hard to carry out an irreversible error.(1) Designing for recovery is exemplified by file deletion in most computer operating systems: a prompt asks you to confirm that you want to delete the file, and, once deleted, the file goes into a recycle/trash bin, where it can still be retrieved if an error has occurred.
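The recovery design described above can be sketched in code. The following Python example is illustrative only (the class and file names are hypothetical, not from any cited source); it shows the two safeguards the paragraph names: a confirmation step before a destructive action, and a recoverable trash bin in place of permanent deletion.

```python
# Hypothetical sketch of "designing for recovery":
# deletions require confirmation, and deleted files remain recoverable.

class FileStore:
    def __init__(self):
        self.files = {"report.txt": "contents"}
        self.trash = {}  # "deleted" files are kept here for recovery

    def delete(self, name, confirmed=False):
        # Forcing function: without explicit confirmation, nothing is deleted.
        if not confirmed:
            return f"Are you sure you want to delete '{name}'?"
        # Move to trash rather than erasing, so the error stays reversible.
        self.trash[name] = self.files.pop(name)
        return f"'{name}' moved to trash"

    def restore(self, name):
        # Recovery path for an erroneous deletion.
        self.files[name] = self.trash.pop(name)

store = FileStore()
store.delete("report.txt")                  # only asks for confirmation
store.delete("report.txt", confirmed=True)  # now the file goes to trash
store.restore("report.txt")                 # the mistake is easy to reverse
```

The same two-layer idea, a barrier before the error plus a recovery path after it, is what HROs build into their critical processes.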
Outward and proactive focus. Learning in HROs does not only occur from within, but also by actively seeking outside knowledge to improve safety. Such organizations don’t need an accident as a wake-up call for action. They consistently engage in proactive risk assessment activities, capitalizing on effective hazard reporting programs and using teams with sophisticated knowledge of systems to test and anticipate the ways things could go wrong. Additionally, HROs demonstrate an outward sharing of this knowledge via mentorship, networking, collaborative work, and external reporting systems.
Community involvement. HROs engage the community in their safety efforts. They build positive relationships with the media and respond openly to the public, who may have anxieties about safety within the industry. They are transparent with the public, disclosing information about accidents, errors, safety hazards, and other performance data. Additionally, community members are often asked to serve on safety panels, where their input is greatly valued and respected.
Effective use of safety measurement. HROs know their safety climate and their level of system performance. They can tell if a change has resulted in an improvement, not just through workforce notification of problems, errors, and accidents, but by devoting resources to more reliable and accurate ways of detecting risk, errors, and harm. One could argue that accidents in HROs are more transparent than medical errors. However, HROs do not simply count the number of airline crashes, nuclear meltdowns, or chemical spills; nor do they rely on workforce notification of problems. Instead, they have standard surveillance plans that help them systematically uncover risk, errors, and less-than-excellent outcomes. Tracking their performance over time, HROs typically have reliable outcome data, which can be used for benchmarking when linked with the processes that produced them.
Conclusion. As noted in the 2001 evidence report prepared for the Agency for Healthcare Research and Quality, “Promoting a culture of safety remains surprisingly unexplored in healthcare settings, where the risk of error is high.”(3) Culture change occurs slowly, but today, we can set the stage for this necessary paradigm shift. The benefits to patient safety are vast, but don’t overlook the positive impact on the workforce, too. Employees who work in HROs with a strong safety culture consistently report that the following values are clearly present and visible in their organization:(3,6)
--Freedom for creativity and innovation
--Pride in achieving goals
--Strong feelings of credibility and interpersonal trust
--Resiliency in the wake of problems
--Helpful and supportive coworkers
--Friendly, open working relationships
--Strong sense of interpersonal responsibility.
The adage “Culture eats strategy for lunch every day” is not without basis and should be heeded by the healthcare industry.
References (for Part I and Part II):
(1) Kohn L, Corrigan J, Donaldson M, eds. To err is human: building a safer health system. Committee on Quality of Health Care in America, Institute of Medicine. Washington, DC: National Academy Press; 2000.
(2) Senge P, Kleiner A, Roberts C, et al. The dance of change. New York, NY: Currency; 1999.
(3) Shojania KG, Duncan WB, McDonald KM, et al. Making healthcare safer: a critical analysis of patient safety practices. Evidence report/technology assessment #43 (prepared by University of California at San Francisco-Stanford University under contract #290-97-0013). AHRQ publication #01-E058. Rockville, MD: Agency for Healthcare Research and Quality; July 2001.
(4) Senge P, Kleiner A, Roberts C, et al. The fifth discipline fieldbook. New York, NY: Currency; 1994.
(5) Baker DP, Salas E, King H, Battles J, et al. The role of teamwork in professional education of physicians: current status and assessment recommendations. Joint Commission Journal on Quality and Patient Safety. 2005;31(4):185-202.
(6) Roberts KH. Cultural characteristics of reliability enhancing organizations. Journal of Managerial Issues. 1993;5:165-81.