Opinion

Organizational accountability in medical errors

Grapple with this thought: You’re riding your bike, attempting to master a new trick, when things take an unexpected turn. Instead of nailing it, you break your arm, requiring an immediate visit to the emergency department. The orthopedic resident, who has seen many broken arms that night, assesses your injury and orders morphine for your pain. However, you are severely allergic to this drug, and its administration sends your body into shock. What started as a visit for a broken arm has escalated into anaphylaxis care.

So, with whom does responsibility lie? Should the provider have inquired about your allergy? Would the oversight be more forgivable if your allergy were extremely rare? What if the outcome of the medication administration had been different?

On any given day, providers grapple with the pressure of delivering quality care amid the relentless demands of a short-staffed shift. With workarounds hastily improvised, a persistent mismatch between supply and demand, and inadequate systems in place to support daily work, errors become increasingly likely.

Far too often, providers find themselves at the forefront of accountability, facing repercussions for a sequence of events that results in patient harm.

“Every system is perfectly designed to get the results it gets. If we want different results, we must change the system.” – W. Edwards Deming

While it’s tempting to focus solely on individual behaviour and choices, various factors must be weighed to determine the appropriate level of accountability. Within the principles of Just Culture, the evaluation of patient-safety incidents goes beyond individual blame and delves into the full context: the system, the specific situation, and the degree to which providers were supported and where improvement is needed. By evaluating the system alongside the individual, we can connect individual behaviours to preventive measures that can be implemented to mitigate future safety incidents.

Providers are fallible and, like everyone else, make human errors. However, acting without a conscious decision, often known as “auto-pilot” mode, can be interrupted when systems are designed to do so.

Consider the case of RaDonda Vaught, a reassigned nurse who was convicted of criminally negligent homicide for administering the wrong medication to her patient. Medication administration is one of many duties completed countless times a day, and when systems are not designed to support a safety check, errors occur. Interventions such as barcode scanning act as a safeguard against predictable errors, effectively interrupting the automatic mode of operation and requiring a conscious effort to verify the information presented by the system. This is one of many cases that underscore the accountability of organizations to implement systems that support daily workflows, especially in environments where supply seldom meets demand.
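To make the idea of a designed interruption concrete, here is a minimal sketch, in Python, of the kind of check a barcode-scanning workflow performs. It is purely illustrative: the names (MedicationOrder, verify_administration) and the data model are assumptions made for this article, not any hospital system’s actual software. The point is that a mismatch produces a hard stop the provider must consciously resolve.

```python
# A minimal, hypothetical sketch of a barcode-verification step.
# MedicationOrder and verify_administration are illustrative names,
# not any vendor's actual interface.
from dataclasses import dataclass

@dataclass
class MedicationOrder:
    patient_id: str   # identifier scanned from the patient's wristband
    drug_code: str    # identifier scanned from the medication's barcode
    dose_mg: float

def verify_administration(scanned_patient_id: str,
                          scanned_drug_code: str,
                          active_orders: list[MedicationOrder]) -> MedicationOrder:
    """Match the scanned wristband and drug barcode against active orders.

    Any mismatch raises an error, a hard stop that pulls the clinician out of
    "auto-pilot" and into a conscious verification step.
    """
    for order in active_orders:
        if (order.patient_id == scanned_patient_id
                and order.drug_code == scanned_drug_code):
            return order  # the scans agree with an active order; proceed
    raise ValueError(
        f"No active order for drug {scanned_drug_code} on patient "
        f"{scanned_patient_id}: administration blocked pending review."
    )

# Usage: a correct scan confirms the order; the wrong vial blocks the workflow.
orders = [MedicationOrder(patient_id="P-1042", drug_code="VEC-001", dose_mg=2.0)]
verify_administration("P-1042", "VEC-001", orders)    # returns the matching order
# verify_administration("P-1042", "MID-002", orders)  # would raise and block
```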

Providers also have a duty to produce an outcome, a duty to follow protocol, and a duty to avoid harm and risk. Adequate resources, learning systems and standardized protocols can support those behaviours and choices. Consider a situation in which a nurse, grappling with a heavy workload and numerous patients, must decide whether to independently transfer a non-mobile patient or wait for assistance, as per protocol. In the process of transferring the patient alone, the patient slips from the nurse’s grasp, leading to a broken hip. In the heat of the moment, the nurse believed the decision to transfer the patient was justified, having conducted a mental risk assessment and prioritized the immediate transfer to make room for the next patient. However, could well-designed learning systems and infrastructure have helped the nurse make the right decision? Access to proper equipment for safe transfers, additional support, education on patient handling and other resources could have made it easier for the provider to make the right choice.

These instances are not highlighted to excuse providers from their role in delivering quality care; negligence does occur, and there are instances of blatant disregard for risk that must be addressed accordingly. However, when assessing incidents, it is also crucial to look beyond individual blame and evaluate the interplay of the situation at hand, the learning systems and the system designs that contribute to the sequence of events leading to an incident.

Let’s revisit the initial scenario of the patient receiving morphine. From an individualistic perspective, you would first place blame on the care team. They should’ve asked about your allergies, and their lack of attention resulted in harm. You are right. However, a bad system will beat a good person every time. A holistic approach would also evaluate the following: (1) triggers that prompt providers to check allergies before a medication is ordered (a simplified sketch follows below); (2) visual cues indicating a patient with an allergy (e.g., a red allergy bracelet); and (3) patient education that promotes ownership, asking the right questions and active involvement in care.
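As a purely illustrative sketch of the first point, and assuming a hypothetical function (check_allergy_interaction) rather than any real electronic health record interface, an ordering-time allergy trigger could look like this:

```python
# A purely illustrative ordering-time allergy trigger. The function name
# check_allergy_interaction and its inputs are hypothetical assumptions,
# not part of any real electronic health record.
from typing import Optional

def check_allergy_interaction(ordered_drug: str,
                              documented_allergies: set[str]) -> Optional[str]:
    """Return a warning if the ordered drug matches a documented allergy."""
    if ordered_drug.lower() in {a.lower() for a in documented_allergies}:
        return (f"ALERT: documented allergy to {ordered_drug}. "
                "Overriding this order requires a documented reason.")
    return None

# Usage: the morphine order from the opening scenario would surface a prompt
# before the drug ever reaches the bedside.
warning = check_allergy_interaction("morphine", {"Morphine", "Penicillin"})
if warning:
    print(warning)
```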

Advocating for the protection of health-care professionals when they make mistakes is not about absolving them of responsibility. It’s about recognizing that our system is complex; that health-care workers, like everyone else, are fallible; and that many mistakes are rooted in system-wide issues.

It’s time we humanized health care by creating an environment of psychological safety in which organizations not only report and discuss errors but view them as opportunities to make tangible change.

“It is tempting, even reassuring, to think that this conduct was so different, so otherworldly, so divorced from the ecosystem in which we practice that system safety could be restored with the removal of a single provider.” – Barbara Olsen

1 Comment
  • Mike Fraumeni says:

    Excellent and well-written piece highlighting the need for health-care workers and administration to be able to report and discuss errors as a learning opportunity to provide excellence in health care. Also of interest are diagnostic errors, and of particular concern are medically unexplained symptoms that can be brushed off as psychogenic in nature, e.g.:

    “As a matter of peculiar professional fact, there is no term that names diagnostic uncertainty without also naming psychological diagnosis,” bioethicist Diane O’Leary and health psychologist Keith Geraghty state in the Oxford Handbook of Psychotherapy Ethics.
    Source: Reymeyer, Julie. “Not in Your Head: Doctors sometimes dismiss physically sick patients with psychiatric diagnoses, entering errors into medical records that impede real treatment for years. Here’s how to protect yourself.” OpenMind, Sept. 6, 2023. https://www.openmindmag.org/articles/its-not-in-your-head

Authors

Pamela Bader

Contributor

Pamela Bader is a first-year MHA student in the University of Ottawa’s Telfer program and Corporate Project Manager at the Queensway Carleton Hospital, overseeing projects in both the clinical and non-clinical realms.

Imène Tissoukai

Contributor

Imène Tissoukai is an MHA student at the Telfer School of Management in Ottawa and Program Coordinator at Bruyere, balancing academic pursuits with a role coordinating programs at the forefront of health care.
