What can we learn from the Vaught case? Don’t penalize nurses for being human

By Dominira Saul

The conviction of a Tennessee nurse for criminally negligent homicide this March sets an uncomfortable precedent for healthcare workers. Medical errors are generally handled by professional licensing boards; criminal prosecutions like RaDonda Vaught’s are rare. Her case also raises pointed questions about the systems and equipment used to support patient care.

The bigger question is: are nurses set up to fail? The cognitive load on a nurse is immense; desensitization, alarm fatigue and repetitive procedures are facts of the working day. Healthcare systems, devices and care models must be designed with this in mind. In medtech, attention to usability can save lives.

We recently worked with a national healthcare network in the US that operates hospitals, emergency rooms and clinics in 20 states, and we have experience with numerous other healthcare organizations. In a medical setting, where errors can be fatal, it’s clear the system of checks and balances needs to be stronger. Medtech systems need to be designed around how nurses actually work. Nurses are bombarded with information during a shift, but current record-keeping and communication systems aren’t purpose-built for them; they are imperfect adaptations pressed into service in a clinical setting.

Nurse convicted of homicide

In 2017, RaDonda Vaught, a former nurse at Vanderbilt University Medical Center, mistakenly administered a powerful paralytic instead of a sedative with a similar name. The patient, a 75-year-old woman, died. Vaught admitted the error and acknowledged there were warning signs that she missed during the incident. She was cleared by the Tennessee Board of Nursing but criminally charged by local authorities.

Vaught was convicted of criminally negligent homicide and gross neglect of an impaired adult on March 25, 2022. In May, she was sentenced to three years’ probation.

According to local news reports, her former employer has faced no penalties and has settled with the patient’s family.

Vaught’s attorney said the nurse made an error and faulted the medication dispensing unit. Prosecutors argued that Vaught overlooked warnings that it was the wrong drug and failed to monitor her patient.

Some of the usability questions raised by the case: What is a deadly paralytic doing in the dispensing unit at all? What human factors contributed to the error? And why is the person reading the labels at fault, rather than the labelling on the drawers and the drugs themselves?

In her testimony to the Tennessee Board of Nursing, Vaught cited desensitization and alarm fatigue. She said she never took in the warnings on the medication vial or on the drawer of the dispensing unit, and that nurses tend to tune out alarms and warnings because they are frequent and often false or inappropriate. “There are a lot of drawers with warnings; it’s a critical care area,” Vaught noted.

This is the working environment designers must understand to design effectively for healthcare professionals. Those of us who have researched medtech know that drug labelling and patient record software play a role in clinical usability.

Dr. Daniela J. Lamas, writing for the New York Times about the Vaught case, said, “Mistakes happen, even to the most vigilant, particularly when we are juggling multiple high-stress tasks. That is why we need robust systems, to make sure the inevitable human errors and missteps are caught before they result in patient harm.”

She explains that some hospitals have added extra levels of approval before certain medications can be administered, and have created no-talk zones around dispensing units, because withdrawing medications is a task that demands focus.
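
To make Dr. Lamas’s point concrete, here is one way a “robust system” can catch a predictable slip before it reaches the patient. This is a minimal sketch in Python, not a description of any real dispensing cabinet’s software; the drug names, the five-character search threshold and the witness requirement are all assumptions made for illustration.

```python
# A minimal sketch, not any real dispensing system's API: every name here
# (HIGH_ALERT, find_medication, can_dispense) is hypothetical.

HIGH_ALERT = {"vecuronium", "succinylcholine"}  # paralytics and similar drugs

def find_medication(query: str, formulary: list[str]) -> list[str]:
    """Return matches only for reasonably specific queries.

    A two-letter search returns nothing rather than a best guess,
    so look-alike drug names can't surface by accident.
    """
    q = query.strip().lower()
    if len(q) < 5:
        return []
    return [name for name in formulary if q in name.lower()]

def can_dispense(medication: str, witness_id: str | None) -> bool:
    """Hard stop: high-alert drugs need a second clinician's sign-off.

    Unlike a warning dialog, there is nothing to click through;
    the drawer simply will not open without a witness.
    """
    if medication.lower() in HIGH_ALERT:
        return witness_id is not None
    return True

formulary = ["midazolam", "vecuronium", "fentanyl"]
print(find_medication("ve", formulary))                  # [] - too short to match safely
print(find_medication("midazolam", formulary))           # ['midazolam']
print(can_dispense("vecuronium", witness_id=None))       # False - hard stop
print(can_dispense("vecuronium", witness_id="RN-4521"))  # True - witnessed
```

The design choice worth noting is that the safeguard is a refusal, not another alarm: adding one more dismissible warning to an environment already saturated with them would only deepen the alarm fatigue Vaught described.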

We have known for more than 20 years that systems within care settings need to account for human error. To Err Is Human, the Institute of Medicine’s groundbreaking 1999 report on preventable medical errors, argued that errors stem not solely from individual healthcare providers but also from systems that need to be made safer. The authors also noted that blaming an individual does little to make the system safer or to prevent someone else from committing the same error.

With this latest case, blaming the individual has gone one step further, to the criminalization of medical error. Should a human error be treated as a criminal offence if we know the appropriate systems are not there to support critical decisions made under duress?

Nursing units are chronically short-staffed, nurses face an ever-growing flood of information, and the pandemic has piled on additional pressure. Existing systems could do far more to prevent errors and ease workloads. Rather than spending time and effort on convicting someone for a mistake, wouldn’t resources be better spent on preventative measures within healthcare systems?

Share your thoughts about medical systems and human error in the comments. For more context, check out our projects with large public and private organizations where we rework systems that are failing, without penalizing people for being human.
