Featured Articles

Ohio Government Plays Whack-A-Mole with Pharmacist

On August 14, 2009, Ohio pharmacist Eric Cropp was sentenced to 6 months in prison, 6 months of home confinement with electronic monitoring, 3 years of probation, 400 hours of community service, a $5,000 fine, and payment of court costs, for his role in a fatal medication error. (Early last week, ISMP President Michael Cohen posted comments regarding the sentencing.) Eric made a human error that tragically led to the death of a child—the fodder of nightmares that plague many health professionals who perpetually fear making that one fatal error. During manual inspection of a compounded chemotherapy solution, Eric failed to recognize that a pharmacy technician had made the base solution using too much 23.4% sodium chloride. The child received the chemotherapy solution and developed severe hypernatremia, which led to her death. 
Human factors research confirms that manual checking systems are not 100% reliable. Under ideal conditions, we—meaning all human beings—fail to perform a check correctly about 5% of the time,[1,2] and we fail to detect an error during the checking process between 5%[2] and 10%[3] of the time. While under moderate stress, our failure to detect an error during an inspection or verification process increases to about 20%.[4,5]
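
As a rough illustration of what these figures imply when they are combined, the short Python sketch below estimates how often an existing compounding error could slip past a single manual check. The rates and the independence assumption are purely illustrative and are not drawn from the methods of the cited studies.

# Illustrative only: combines the checking-failure rates cited above, under the
# simplifying (hypothetical) assumption that the two failure modes are independent.
def undetected_error_rate(p_check_done_wrong, p_miss_during_check):
    """Probability that an existing compounding error survives one manual check."""
    p_caught = (1 - p_check_done_wrong) * (1 - p_miss_during_check)
    return 1 - p_caught

ideal = undetected_error_rate(0.05, 0.075)    # ~5% of checks done wrong; ~5-10% of errors missed
stressed = undetected_error_rate(0.05, 0.20)  # miss rate rises to ~20% under moderate stress

print(f"Slips past one check (ideal conditions): {ideal:.1%}")     # about 12%
print(f"Slips past one check (moderate stress):  {stressed:.1%}")  # about 24%

Under these illustrative assumptions, roughly one in eight errors would survive a single check even under ideal conditions, and about one in four under moderate stress, which underscores why a lone manual inspection is a weak final safeguard.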

According to news media[6-8] and personal conversations with Eric’s defense attorneys, conditions under which Eric was working on the day of the event were far from ideal and outside his control:

  • The pharmacy computer system was down in the morning, leading to a backlog of physician orders
  • The pharmacy was short-staffed on the day of the event
  • Pharmacy workload did not allow for normal work or meal breaks
  • The pharmacy technician assigned to the IV area was planning her wedding on the day of the event and, thus, was highly distracted
  • A nurse called the pharmacy to request the chemotherapy early, so Eric felt rushed to complete his check and dispense the solution (although, in reality, the chemotherapy was not needed for several hours).

We don’t have details regarding how verification of IV admixtures occurred in this hospital, but we have observed unsafe variations of the checking process in other hospitals—from a jumble of vials and syringes with plungers pulled back to the supposed volumes of the additives, to vials and syringes from different admixtures sitting together on a cluttered surface awaiting verification. We also know little about why the technician made the compounding error, other than press reports stating that she was highly distracted that day. However, we know that compounding a chemotherapy base solution from scratch is error-prone and often unnecessary; such exactness in base solutions is frequently not required from a clinical standpoint.

The price of this medication error was enormous: a beautiful 2-year-old child named Emily Jerry lost her life; Emily’s family will forever suffer the pain of her loss; healthcare practitioners who were involved in the error and/or Emily’s care are forever changed by the event; and Eric Cropp, who will never practice again (the Ohio Board of Pharmacy permanently revoked his license), will forever feel the weight of his human fallibility and how it played out on that fateful day, all while serving an undeserved term of incarceration and facing other criminal and civil penalties.

David Marx, CEO of Outcome Engineering, likens such a punitive response to human error to a child’s game of Whack-a-Mole. The game is played by lying in wait until a mole (the adverse event) pops up, and then trying to whack the exposed mole with a hammer (to punish the person closest to the event) before it retreats back into the safety of its hole. In his profoundly moving new book on this topic, Whack-a-Mole: The Price We Pay for Expecting Perfection[9] (available at Barnes & Noble), Marx notes that this child’s game is a telling depiction of how we set unrealistic expectations of perfection for each other and then unjustly respond to our fellow human beings who inevitably make mistakes. We play the game at work by writing disciplinary policies that literally outlaw human error. Our legislators play the game by writing laws that make human error a felony punishable by prison. We take the easy route with a “no harm (no visible mole), no foul (no whack required)” policy. We turn a blind eye to those imposing unnecessary risk as long as the outcome is good (no mole pops up). But we push our need for perceived “justice” to the point that every harmful adverse outcome must have an accompanying blameworthy person to punish.

According to Marx, the Whack-a-Mole game is simple and addictive: a healthcare professional makes a harmful error and the healthcare system in which he works fires him—whack! The professional licensing board takes his license away—whack! The newspapers and online news media demonize the dedicated professional who has made the mistake—whack! The civil court demands payment from the professional for the bad outcome—whack! The criminal court sends him to jail—whack! Leaders in the healthcare system who employed him stand by silently, without uttering a single word about the system-based causes of the error to help defend the individual—whack-whack! Society is poised to pounce, to swing the hammer when someone is injured. Punish the person most visibly involved in the error and the game is won. Problem solved. Mole whacked. As Marx writes, the “if we all just do our jobs correctly and follow the rules” club tends to view all bad outcomes as blameworthy incidents—even in the presence of poorly designed systems and performance-shaping factors outside the control of involved workers; even in the absence of an intent to harm or an evil-meaning mind.

No matter how hard we try, human endeavors carry inherent risks. We can try to do everything possible to make care safe for patients, but we often fail to plan for the unexpected—a computer system that is nonfunctional when you arrive at work, causing a serious backlog of orders; an inadequate level of staff on duty because of unexpected absences; a distracted technician working in a hectic, high-risk IV area—just a few of the unexpected conditions in Eric’s case on the day of the event. As Marx notes in his book, civil, criminal, and regulatory systems are increasingly obscuring the differences between intentional, risky choices and inadvertent human fallibility. Thus, the net cast to catch criminals is now catching those whose only crime is that they are human. The criminal courts are playing the most extreme version of Whack-a-Mole with the lives of all healthcare professionals, for who among us cannot say, “It could have been me” when thinking about the plight of Eric Cropp and Emily Jerry?

Marx makes it clear in his book that playing the Whack-a-Mole game costs us dearly, in lives that will continue to be lost due to our failure to learn from mistakes, and in resources that could be put to better use. Playing the game does nothing to help us learn what we might do differently the next time to avoid a similar tragedy. In fact, ISMP is unaware of any steps taken to help other Ohio hospitals learn from this event and redesign their systems accordingly. We have not heard about any visits by state surveyors to detail expectations regarding prevention strategies in all Ohio hospitals. If nothing has changed in Ohio hospitals, or in other hospitals across the US, the death of this little girl is a heartbreaking commentary on healthcare’s inability to truly learn from mistakes so we are not destined to repeat them. On a positive note, though, the Ohio legislature passed and implemented Emily’s Law, which requires all pharmacy technicians to be trained, tested, and certified via a course approved by the state board of pharmacy, as they are in 26 other states.

There is another insidious flip side to the Whack-a-Mole game: it prevents learning by driving errors underground, and it discourages students from becoming healthcare professionals. Some will ask, “Why disclose errors and risk punishment, loss of a hard-earned license, or jail?” Thus, some risks will go unaddressed, and harm will not be prevented. College students may not be drawn to legally “risky” healthcare professions, and professionals working in healthcare may try to avoid risky tasks, such as compounding IV solutions.

Marx makes a compelling argument that the Whack-a-Mole approach is ineffective, inefficient, unsafe, and wholly unjust. There is a better way of dealing with human error and promoting the behavioral choices that best support safety. We spend far too much time reacting to the severity of the outcome and punishing the unfortunate soul closest to the harm, and far too little time addressing the system design that got us to the bad outcome and the behavioral choices that might have contributed to the outcome. A bad outcome should never automatically qualify a practitioner for blame and punishment. We will never be able to design a perfect healthcare system because it is predominantly a human-based system despite our ever-increasing use of technology. Likewise, we cannot and should not expect perfection from each other, no matter how critical the task may be. We are fallible human beings destined to make mistakes along the way, as well as to drift away from safe behaviors as perceptions of risk fade when trying to do more in resource-strapped professions. Our real power to protect patients is in the systems we build around imperfect human beings. 

References:

  1. The Institute of Petroleum. Human reliability analysis. Human factors briefing note no. 12. London, England; 2003.
  2. Grasha A. A cognitive systems perspective on human performance in the pharmacy: implications for accuracy, effectiveness and job satisfaction. Executive Summary Report, Report No. 062100. Alexandria, VA: National Association of Chain Drug Stores; October 2000.
  3. Lewis M. THERP: Technique for Human Reliability Analysis. Pittsburgh, PA: University of Pittsburgh; 2002.
  4. System Reliability Center. Technique for human error rate prediction (THERP). Rome, NY: Alion Science and Technology; 2005.
  5. Gertman D, Blackman H, Marble J, et al. The SPAR-H human reliability analysis method. Prepared for the Division of Risk Analysis and Applications, Office of Nuclear Regulatory Research, US Nuclear Regulatory Commission (NRC Job Code W6355); Washington, DC; August 2005.
  6. McKoy K, Brady E. Rx for errors: drug error killed their little girl. USA Today. February 2, 2009.
  7. Sangiacomo M. Chris Jerry, whose daughter Emily died from a pharmacy technician’s mistake, starts foundation to push for national law. The Plain Dealer. June 13, 2009.
  8. Atassi L. Former pharmacist Eric Cropp gets 6 months in jail in Emily Jerry’s death from wrong chemotherapy solution. The Plain Dealer. August 15, 2009.
  9. Marx D. Whack-a-Mole: The Price We Pay for Expecting Perfection. Plano, TX: By Your Side Studios; 2009.