Hold the Blame

Aug. 23, 2022
Mark Keough explains why, if medical errors become criminal, the fire service could be put in a very precarious position.

In December 2017, a nurse at Nashville’s Vanderbilt University Medical Center made a fatal medication error, which resulted in criminal charges. Five years later (because of delays and COVID), RaDonda Vaught was found guilty of criminally negligent homicide and gross neglect of an impaired adult. Mercifully, Vaught was sentenced to three years of probation. Various reports cited a potential maximum sentence of four to eight years in prison. She lost her nursing license.

If the legal system decides to take aim at the fire service, could the same thing happen to prehospital providers?

Deaths from medication errors

According to research, tens of thousands of patients die of in-hospital medication errors annually. A 2016 study by Johns Hopkins researchers analyzed data from numerous U.S. studies of medication errors and poorly delivered medical care. The study estimated at least 250,000 U.S. deaths per year from those two causes. If the numbers are even close to accurate, medical errors by healthcare providers are the third-leading cause of death in the United States, behind heart disease and cancer.

I was shocked when I considered this study, and I wondered whether anything could be done to make patient care in hospitals safer. Furthermore, do we have a similar problem in the prehospital care environment?

The importance of context

In the United States, prosecution of healthcare providers normally is reserved for those who are suspected of knowingly and willfully causing harm. Vaught promptly assumed responsibility for her error. According to court documents, the district attorney general for the 20th Judicial District based the decision to prosecute on three failures: Vaught ignored warning labels on the medication packaging; didn’t notice that the medication was different than what was ordered; and failed to scan the medication against the medical identification bracelet that the patient wore. This was deemed criminal behavior, and charges were brought against Vaught.

The patient in this case was a 75-year-old woman. She accidentally was dosed with vecuronium, a paralytic, instead of Versed, a sedative, prior to a PET scan that was scheduled before her discharge. Vaught, who administered the vecuronium, had two years’ experience as a nurse when the error occurred.

I find two unsettling factors about this legal action.

First and foremost is the fear of legal reprisal and its effect on any organization’s learning culture. This type of action by legal entities naturally affects any organization’s front-line people and their future willingness to report. How can a system improve if it can’t learn when or how someone made a mistake within the system that the person was working in?

Human failure is normal. It happens in successful operations as well as unsuccessful ones. Learning from operations, good and bad, can be the bread and butter of building resilient systems. If a retributive threat exists, externally or internally, learning takes a back seat to survival.

Second, if the justice system criminalizes human error in the management of patient care, will it eventually criminalize CEOs’ efficiency decisions that contribute to hurried patient care or unrealistic caregiver production goals? Will it someday criminalize a fire chief’s decision to operate with three personnel versus four? Finally—and this is a big one—who gets to draw the line between what’s determined to be criminal and what isn’t?

Without understanding the context of the goal conflicts that Vaught experienced, review of this case easily succumbs to the seductive nature of blame. Blaming Vaught gives the illusion that the hospital system still is a safe place to receive treatment and healing: the get-rid-of-the-bad-apple syndrome. Context has everything to do with why this event happened. However, according to the reports, context didn’t appear to play a role in the decision to prosecute. Vaught became a scapegoat.

Context considerations in this case would include distractions, experience, environment, technical and mechanical systems and, particularly, competing priorities. Context also considers how tightly coupled or connected things are and the goal conflicts that front-line rescuers and caregivers manage (usually very successfully) every day. Context provides the complex circumstances that result in human error, and it offers us much of the “why” behind what happened, which helps to uncover how work really gets done.

Leadership should be very interested in how the front line conducts work in the field through all of that complexity. Work isn’t always done exactly as it was designed. We might want to learn more about the concept of “work as imagined versus work as done.” Our front-line firefighters understand this concept, because they do the work but might not get a chance to design the process. They deal with goal conflicts and system issues every day.

Take a closer look

With some context added to this event (drawn from press reports and court proceedings), see whether you can identify pressures or goal conflicts that Vaught might have been managing when the error took place.

At the time of the event, Vaught was the “help-all nurse” in the hospital, assisting other departments while providing orientation for a new nurse. She balanced priorities as she was dispatched from location to location to assist or fill in. The hospital trusted her with the orientation of a new nurse.

At the time that the medication error occurred, nurses throughout the Vanderbilt hospital system were encouraged to override the automated pharmaceutical dispensing technology to get medications, because the technology wasn’t working as designed. With regularity, nurses used the override feature to keep patient care on schedule. Vaught overrode the system to get Versed, because the order for the patient wasn’t showing up in the system. After she typed in the first two letters, V-E, the machine gave her vecuronium.

Despite the warning labels on the packaging, the medication was administered to the patient in the radiology department to treat her anxiety about the scheduled PET scan. Vaught didn’t scan the medication with a scanner (an added layer of patient protection), because the radiology area where the medication error occurred didn’t have one.

The doctor’s orders stated that post-administration monitoring of the patient wasn’t needed, and a radiology technician took the patient back to the scanner. After administering the drug, Vaught proceeded to the ER to perform another medical procedure with the new nurse.

In the months after the patient’s death, the hospital didn’t report the medication error to state or federal officials (as required by law) or to its accrediting agency. The hospital didn’t see the medication mix-up as neglect. Two hospital neurologists attributed the patient’s death to a brain bleed and stated on the death certificate that the death was “natural.” It was an anonymous tip 10 months later that alerted officials that a medication error had occurred.

By adding context to this story, you might see a different system picture emerge. We might begin to see how system design (poor and good) influences worker behavior more than we care to admit. We might even see how “the system” might be protecting itself.

Technology (a new pharmaceutical dispensing system) ended up with a workaround, through which nurses gained access to pharmaceuticals that shouldn’t be as easily available as others.

The distractions of training a new nurse and the production pressure of moving to the next task played a role in worker behavior. Hospital “efficiencies,” such as stretching nurse resources thin, impede safe patient care.

How not to improve

It’s easy to say that Vaught wasn’t paying attention and could have been more careful. It’s easy to agree with hospital leadership and the county attorney and blame Vaught for the fatal error. After all, she gained access to the drug and administered it. However, I strongly caution against this type of response to failure, because it won’t improve patient safety in hospitals or in prehospital environments.

Criminalizing human error, or “holding people accountable,” as it often is called, won’t encourage the kinds of system improvements that are needed. It will do the opposite. It will create environments of distrust and push needed improvements underground out of fear.

How any fire department leadership responds to failure matters. The response to error in the Vaught case demonstrates how not to improve. Neither the fire service nor any other service can blame and learn at the same time. When fire department leadership welcomes failure as the basis for learning, our front-line people will respond by offering solutions to problems that we haven’t even imagined yet.

About the Author

Mark Keough

Mark Keough retired after 34 years of service with the Mesa, AZ, Fire and Medical Department. He served as a chief officer for more than 15 years in various roles and retired as the deputy chief of operations. Keough teaches in industrial settings and offers investigational course work through the Fire Department Safety Officers Association.
