Featured Articles

Pump Up the Volume: Tips for Increasing Error Reporting and Decreasing Patient Harm

Error-reporting systems continue to be an important tool for improving patient safety and often represent one of the primary means by which healthcare providers learn about:

  • Potential risks: hazardous conditions hidden in systems, processes, or equipment

  • Actual errors: errors and close calls that occur during the delivery of patient care

  • Causes of errors: underlying weaknesses in systems, processes, or equipment that explain why an error happened

  • Error prevention: ways to prevent recurring events and, ultimately, patient harm

Error-reporting systems can identify local system hazards, foster a culture of open communication, promote the concept that each staff member is an important contributor to safety, share lessons learned within and across organizations, and provide an initial record of an adverse event.1 However, even today, error-reporting systems are not used to their full potential, largely due to staff underreporting and a lack of meaningful analysis and change in response to error reports. This article addresses the reasons for staff underreporting as well as tips for increasing the frequency and value of reporting. Later in 2021, we will publish a follow-up article in this newsletter that addresses meaningful analysis and change in response to error reports.

Barriers to Error Reporting

While error reporting (including close calls) is a fundamental component of a safety culture, encouraging healthcare workers to submit reports is no easy task given the potential disincentives to reporting. First, reactions to making errors vary, but candidly confessing a mistake is rarely comfortable; in fact, people have a natural desire to forget that the incident ever happened. Even healthcare workers who are willing to speak up about errors may believe that the extra work is not worth their time if they perceive that no benefit will come from reporting, especially if they experience error fatigue from inevitable, recurring errors that never seem to be addressed. They may be even less likely to report if the reporting process is time consuming, confusing, or complex.

Second, healthcare workers may not consider reporting to be a priority, especially if the error was captured and corrected before it reached a patient, as with close calls. Close calls may be seen as “unworthy of reporting” because they did not cause patient harm, or they may be dismissed as a “one-time event” that does not need to be reported.2 However, the odds of reporting a close call are higher if the error was caught later in the process (closer to the patient), was considered a system vulnerability rather than a sign of system resilience, and was felt to be an event that “nearly happened” rather than one that “could have happened.”3,4 Thus, the willingness to report a close call seems to be related to a strong outcome bias and to how close the event came to harming the patient.

Finally, the likelihood of reporting is highly dependent on the degree of psychological safety felt by healthcare workers. Workers are understandably reluctant to report errors if they worry that the information will get them or their colleagues in trouble, legally or socially; affect their jobs or working relationships with others; or lead to the perception that they are careless, incompetent, or an informant. Consider the following example of a nurse who was reluctant to report a dosing error with verapamil to the charge nurse.

A nurse misunderstood an order for a bolus dose of intravenous (IV) verapamil 5 mg followed by a continuous infusion of 5 mg/hour for a step-down unit patient who suddenly developed atrial fibrillation and tachycardia. For the bolus dose, the nurse removed two vials of verapamil from an automated dispensing cabinet that clearly noted the strength on each vial as “5 mg per 2 mL.” She confused the “2” in “2 mL” to mean that she should administer “2 vials” to equal the prescribed 5 mg dose. She administered both vials of verapamil—10 mg, or twice the prescribed bolus dose—and immediately recognized her error.
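As a brief aside, the arithmetic behind this mix-up can be made explicit. The sketch below is illustrative only (the function name and structure are a hypothetical restatement of the case, not part of the original report): the volume to administer is the prescribed dose divided by the vial concentration, so a 5 mg dose of a 5 mg per 2 mL product is exactly one vial (2 mL), while giving “2 vials” delivers 10 mg.

    # Illustrative arithmetic only; the function name and structure are hypothetical.
    def volume_to_administer(prescribed_dose_mg, vial_strength_mg, vial_volume_ml):
        """Volume (mL) needed to deliver the prescribed dose from a given vial concentration."""
        concentration_mg_per_ml = vial_strength_mg / vial_volume_ml  # 5 mg / 2 mL = 2.5 mg/mL
        return prescribed_dose_mg / concentration_mg_per_ml

    correct_volume_ml = volume_to_administer(5, 5, 2)  # 2.0 mL, i.e., exactly one vial
    misread_dose_mg = 2 * 5                            # "2 vials" delivers 10 mg, twice the order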

When the patient’s physician suddenly appeared on the unit, the nurse was comfortable telling the physician about the error, but she spoke in a hushed tone. The nurse then added that she would have to tell him the rest of the details after the charge nurse moved out of earshot. The verapamil continuous infusion was prepared by the pharmacy and was started 15 minutes later. Luckily, the patient, who was already on telemetry, showed no signs of toxicity over the next several hours.

Despite encouragement from the physician, the error was never reported within the facility. Thus, the opportunity for other clinicians and managers to learn from this mistake was lost because something, perhaps fear of reprisal, prevented this nurse from reporting the error or involving her charge nurse.

Tips to Increase Error Reporting

Regardless of the potential disincentives to report, some highly functional internal and external error-reporting systems exist today, including the practitioner-based ISMP National Medication Errors Reporting Program (ISMP MERP) and the ISMP National Vaccine Errors Reporting Program (ISMP VERP). From these, best practices that promote active error reporting and opportunities for shared learning can be identified. These best practices fall into the following nine categories that impact the quantity and quality of reports (also see Table 1).

Trustworthiness. Those who receive and act on error reports must earn the trust of reporters and prove that the program is sensitive to reporters’ concerns, particularly fear of punishment or undue embarrassment for making and reporting errors. Feelings of trust are fostered by leaders who demonstrate an unequivocal passion for safety, acknowledge the high-risk nature of healthcare and human fallibility, and use reports of errors and close calls to assess system performance, not staff performance.

Open, fair, and learning culture. Leaders who act on error reports must establish a just approach to assessing and responding to errors and events, fostering learning, and gaining staff trust and participation in improving patient safety. They must create an environment of internal transparency around risk: promptly identifying system, equipment, and behavioral risks that could cause harm; sharing error reports for the purpose of learning; and using data (e.g., data from technology, monitoring of triggers), not error reports, to measure risk. Ideally, what is needed is a Just Culture in which workers thrive and are encouraged to provide essential safety information without fear of being judged or treated unfairly in the wake of an error, and without worrying about error rates.5

Confidential. Those who receive reports must keep confidential the identity of the reporter, healthcare workers involved in the error, and the location of the event to prevent undue embarrassment or undesirable attention. However, anonymity when reporting is not recommended, as those who receive the report would not be able to talk to the reporter or others involved in an error to learn about the causative factors. Anonymity also signals to reporters that it may not be safe to provide their identity or location, which undermines the idea of trustworthiness. Removing identities after the error has been fully investigated is an option to maintain confidentiality.     

Clear. Healthcare workers should be provided with clear definitions and multiple examples of the types of errors, close calls, and hazards that should be reported, including concerns they may have about their environment, technology, processes, and patient safety. Be clear with workers about what information and descriptions should be included in the free-text narrative section of the report so that a few words, a single sentence, or incomplete reports do not become the norm.

Easy. Reporting mechanisms should be exceedingly easy to use and readily accessible, and should require minimal training. Those who receive reports must pay attention to the format and length of the required report: if the report is too long, it will stifle reporting; if it is too short, there may not be enough information to make it useful. Instead of asking the reporter broad, general questions, the report should prompt for key identifying information and a free-text description of the event. While a narrative description is often the most useful information in the report, you might also consider asking questions that are specific to the type of event (e.g., for medication errors, the name/dose of the drug[s] involved; for falls, the location of the patient at the time of the fall; for medical devices, the specific make/model of the device) to prompt for the most pertinent information about the event.
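To make that structure concrete, the sketch below shows one way such a report could be organized. The field names are hypothetical illustrations, not drawn from any particular reporting system: a few key identifiers, a free-text narrative, and optional prompts that apply only to certain event types.

    # Hypothetical sketch of a minimal event report: a few key identifiers, a free-text
    # narrative, and optional prompts that apply only to certain event types.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class EventReport:
        event_type: str                                 # e.g., "medication error", "fall", "device"
        date_time: str
        care_area: str
        narrative: str                                  # free-text description, often the most useful part
        drug_name_and_dose: Optional[str] = None        # prompted only for medication errors
        patient_location_at_fall: Optional[str] = None  # prompted only for falls
        device_make_and_model: Optional[str] = None     # prompted only for medical devices

    report = EventReport(
        event_type="medication error",
        date_time="2021-06-01 14:30",
        care_area="step-down unit",
        narrative="Bolus order misread; two vials given instead of one.",
        drug_name_and_dose="verapamil 10 mg IV (5 mg prescribed)",
    )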

When investigating or following up on a close call or error, a reporting tool could help identify missing information about the patient or drug, communication problems, labeling and packaging problems, drug storage problems, environmental problems, and so on (for a sample tool, click here). Probing questions shift a lot of the analytical work away from the reporter and make it easier for the investigator to uncover some of the causative factors that led to the error.

Do not place too much emphasis on frontline workers completing the entire reporting form—key information and a narrative description should be the minimum requirements. A patient/medication safety officer familiar with the reporting system should further investigate events that have merit. Event-reporting mechanisms should also be flexible enough to accept streamlined information through both formal and informal pathways, including oral, written, and electronic submissions.

Credible and useful. Few things impede reporting more than perceived inaction and failure to use the information contained in a report to improve safety. Unfortunately, most reported problems are not acknowledged or addressed, let alone remediated, and workers often do not perceive error reporting as a good use of their time. Additionally, analyses of reported events are often superficial and do not result in meaningful change.1

Leaders must devote the necessary resources not only to collecting reports but also to analyzing reported events and mitigating the risks they expose. Furthermore, those who receive reports must provide rapid, useful, and understandable feedback to healthcare workers across departmental lines, keeping them informed about how their reports are being used to improve systems and processes, even if only to thank the reporter and let them know the event is being investigated. If staff observe changes based upon their reports and feedback, they will be more willing to take the time to report hazards and errors.

Rewarding. Although recognition is not as satisfying as knowing that a report resulted in system-level action, those who receive reports and other organizational leaders should occasionally acknowledge reporters for playing a positive role in patient safety through reporting.

No severity bias. While a prioritization hierarchy for harmful or potentially harmful events may be appropriate when deciding which events warrant more thorough analysis, those who receive reports and organizational leaders should not allow the severity of the outcome or patient harm to drive the response to the report. Not allowing the severity of the outcome to influence decisions helps uphold a commitment to: a) avoid unwarranted punishment of human error or at-risk behavior by overreacting to a singular event, and b) address a potentially fatal system design flaw or reckless conduct even though the patient was not harmed.

Reinforced imperative. Those who receive reports must establish mechanisms for mentoring new and existing staff about the error-reporting process, stress the importance of reporting hazards, close calls, and errors, and include clear expectations for reporting activities in all job descriptions and performance evaluations.

Conclusion

By following the tips provided above and in Table 1, organizations can optimize reporting and their capacity for learning about the human, technical, organizational, and environmental factors that determine the safety of the system as a whole. While pumping up the volume of reporting is an admirable goal, do not become too focused on the gathering of error reports. The ultimate measure of success for error-reporting programs is not the number of reports received but rather the learning that occurs and the amount of patient harm prevented as a result of system changes prompted by the reports. While it may be difficult to measure risk avoidance and a reduction in patient harm, a reasonable alternative is measuring the number of system changes made as a result of the error-reporting system.1 Look for a feature article later in 2021 about how to aggregate and prioritize reported events and investigate them thoroughly so meaningful system changes can be implemented and measured. 

Table 1. Best practices that encourage error reporting

Trustworthiness
  • Patient safety is clearly reflected in the organization’s mission, vision, values, and strategic goals.
  • Leaders’ decisions demonstrate a visible and unequivocal passion for safety and the prevention of patient harm.
  • Leaders acknowledge the high-risk nature of healthcare and human fallibility.
  • Leaders are visible in work areas to learn firsthand about the barriers to safe care and to make themselves available for discussions about patient safety.
  • Leaders share responsibility for errors when they occur.
Open, fair, and learning culture
  • Leaders treat all workers fairly and equitably when responding to an adverse patient safety event.
  • Leaders do not discipline individuals who report or commit human errors or at-risk behaviors; disciplinary sanctions are reserved for reckless conduct, knowingly causing unjustifiable harm, and purposely causing harm.
  • Leaders utilize errors to assess system performance, not staff performance.
  • Leaders openly discuss hazards, close calls, and adverse events, along with the lessons learned and recommended risk-reduction strategies.
  • Leaders encourage providers and staff to report hazards and precursors to harm so they can mitigate risks before harm occurs.
  • Leaders use reports of errors and hazards outside the organization to make proactive system changes to reduce the risk of similar errors within the organization.
Confidential
  • Confidentiality is guaranteed for reporters, individuals involved in errors, location of events, and patient identity.
Clear
  • Staff are provided with clear definitions and multiple examples of the types of errors, close calls, and hazards that should be reported.
  • The error-reporting process (with examples) is covered during orientation for all providers and staff.
Easy
  • Providers and staff have an easy method(s), including informal pathways, for reporting hazards, close calls, and errors.
  • The reporting system is so simple that it can be used with minimal training.
  • The format used to collect information about events is tested for clarity and ease of use, and edited as needed before or after implementation.
Credible and useful
  • Leaders have developed guidelines to identify and prioritize events for which conducting a thorough investigation and/or a root cause analysis (RCA) is appropriate and useful.
  • Pathways have been established for sharing the lessons learned from error analysis and RCA (e.g., storyboards, newsletters, staff meetings, educational presentations, daily safety huddles).
  • Leaders act upon error and hazard reports by fixing system vulnerabilities, rather than punishing individuals.
  • Leaders support system enhancements suggested by staff to reduce the risk of harmful errors.
  • Leaders empower staff to correct safety hazards (in conjunction with appropriate communication with leadership).
  • Leaders consistently provide feedback to staff regarding the actions planned and taken to prevent errors.
  • Pathways have been established for meaningful cross-departmental sharing of memorable error stories and error-reduction strategies.
  • Pathways have been established to share meaningful data to demonstrate safety problems and ensure that actions taken have been successful in reducing risk, error, and/or patient harm.
  • External reporting is encouraged so that patient safety organizations can disseminate useful information to others and work to address problems at the regulatory, standards, and industry levels.
Rewarding
  • Pathways have been established for thanking and rewarding staff who report errors or hazards, and for patient care units for demonstrating measurable improvements in patient safety.
  • Demonstrable results and actions taken by the organization based upon the information received in reports are made evident, shared, and celebrated.
No severity bias
  • Leaders do not overreact to a singular event with unwarranted disciplinary sanctions even when a patient is harmed.
  • The severity of harm from an adverse event does not determine whether leaders address a patient safety event.
  • Leaders do not overlook repetitive patient safety problems because patients have not yet been harmed.
Reinforced imperative
  • New providers and staff are assigned a mentor to assist with the error-reporting process.
  • New providers and staff are required to report at least one safety hazard during their orientation period.
  • Participation in error, close call, and hazard reporting is included as a core element in all staff members’ job descriptions and performance evaluations.

References

  1. Pham JC, Girard T, Pronovost PJ. What to do with healthcare incident reporting systems. J Public Health Res. 2013;2(3):e27.
  2. Hewitt TA, Chreim S. Fix and forget or fix and report: a qualitative study of tensions at the front line of incident reporting. BMJ Qual Saf. 2015;24(5):303-10.
  3. Jung OS, Kundu P, Edmondson AC, et al. Resilience vs. vulnerability: psychological safety and reporting of near misses with varying proximity to harm in radiation oncology. Jt Comm J Qual Patient Saf. 2021;47(1):15-22.
  4. Institute for Safe Medication Practices (ISMP). Close calls—a sign of resilience or vulnerability? Odds are higher that vulnerabilities are reported. ISMP Medication Safety Alert! Acute Care. 2021;26(13):2-3.  
  5. Institute for Safe Medication Practices (ISMP). The differences between human error, at-risk behavior, and reckless behavior are key to a Just Culture. ISMP Medication Safety Alert! Acute Care. 2020;25(12):1-5.