Addressing the Greatest Harm: Diagnostic Safety in EMS

by Maia Dorsett, MD PhD FAEMS

When we teach about patient safety and medical errors, we put up pictures of the Swiss cheese model (1, 2) and demonstrate how harm occurs as errors fall through serial holes of imperfect systems.  In many ways, this model rings true for the examples it is most often used to illustrate, including medication error.  But I do not think it works as well for the more common errors that occur every day with the greatest impact on patient care: diagnostic errors.

What is diagnostic error?  Diagnostic error, defined as "the failure to establish an accurate and timely explanation of the patient's health problem or communicate that explanation to the patient," accounts for the greatest proportion of harm to patients in our system. (3)  On the surface, this makes sense.  With the exception of emergency stabilizing measures, such as treating respiratory failure with assisted ventilation, the failure to make an accurate diagnosis in a timely manner decreases the chance that the treatments provided are tailored to the correct problem.  Diagnostic error increases in situations of high uncertainty, unfamiliarity with the patient, and conditions of high stress, workload and distraction (4), making the practice of EMS medicine particularly susceptible.

Prehospital clinicians begin with undifferentiated patients and, through an iterative process of information gathering, integration and interpretation, develop a working diagnosis that guides their treatment and optimizes subsequent care.  The iterative nature of this process is what makes the Swiss cheese model less suited to describe diagnostic error.  As clinicians, we integrate prior information into our decision-making process, thus creating the conditions for diagnostic momentum, where a prior diagnosis is accepted without sufficient reassessment or skepticism. Diagnostic error thus more closely resembles a game of dominoes than slices of Swiss cheese. As the point of first medical contact, prehospital clinicians therefore have a critical role to play in ensuring diagnostic safety.

As both an EMS educator and a medical director invested in quality improvement, I have been thinking about diagnostic error a lot lately.  From a quality improvement perspective, many of our metrics evaluate performance once a diagnosis has already been made (e.g., did you check a blood sugar in the suspected stroke? What was your scene time for the STEMI?).  But as our perspectives have broadened, particularly through examination of care disparities, we can begin to see how the greatest harm can come from diagnostic error, including the failure to consider the diagnosis in the first place.  From a system perspective, interventions to improve diagnostic safety have centered on approaches such as diagnostic support tools, but these tools still rely on the clinical judgment of the clinicians who use them. (5)

As medical directors and educators, one of our roles is to support EMS clinicians in developing practices that improve diagnostic safety.  So how do we do this? There is not a lot of strong evidence in this area, but there are certainly some good ideas.  Here are a few.

Value thorough patient assessment… and reassessment

My favorite quote from James Clear's book Atomic Habits is "your outcomes are the lagging measure of your habits." (6)  Nowhere does this ring more true for me than in the arena of clinical care, and it is why we aspire to build systems that promote safer habits.  When we think about safe habits, we picture rig checks, checklists and crosschecks.  But our approach to patient assessment is also a habit that we build.

There is growing concern that as technology takes on an increasingly prominent role in medicine, the "traditional" components of patient assessment receive less attention.  We have all witnessed the error of focusing on and treating a monitor rather than the patient.  As part of initial EMS education, we focus on the differentiation of "sick" vs. "not sick" as part of the general impression.  But the reality is that in emergency services, "obviously sick" is a rare general impression and the largest population falls into the "maybe sick" category.  As an emergency physician practicing in an often overcrowded ED with long wait times, it is not the obviously sick patients that feed my anxiety; it is the unknown, "maybe sick" patients in my waiting room.

So how do we begin to sort out the "maybe sick"? Fundamentally, it is the patient assessment.  Our iterative diagnostic pathways rely on a series of inputs, and if those inputs are woefully incomplete, the accuracy of our conclusions will be compromised.  Time-critical diagnoses such as stroke, acute coronary syndrome and sepsis may present as nondescript complaints such as dizziness, weakness, lightheadedness or a fall.  It is thorough patient assessment, including history and exam, that lets us sort through the clinical probabilities of significant illness and potentially change a patient's trajectory through the system.  Yet these patients are less likely than the obviously sick to receive a thorough assessment, and more likely to suffer a subsequent diagnostic delay.  As medical directors, we need to reinforce this in our trainings, our quality improvement activities and our actions.  Indeed, what you measure often reflects what you value.  Which of your quality metrics address patient assessment? How do you systematically identify diagnostic errors? We need to assign as much or more value to the assessment and diagnostic skills of EMS clinicians as to their ability to perform the specific interventions that define their scope.

Teach clinicians to balance their intuitive and analytic processing abilities

The dual process model of clinical reasoning asserts that problem solving, such as the development of a working diagnosis, is the result of an interplay between intuition and analytical reasoning. (7)  Intuition develops as a result of the unconscious application of previously developed mental models in response to recognized patterns or cues.  In prehospital medicine, this enables rapid translation of recognition into action and forms one component of the development of expertise.  Intuition is accurate until it is not; it is at this point that analytic reasoning, a conscious and deliberate approach to solving problems, becomes necessary.

As medical directors and educators, we need not only to foster intuition by closing the loop with follow-up on patients with a wide variety of clinical presentations (and working on system integration so that this follow-up becomes more seamless), but also to teach cognitive forcing strategies that help clinicians recognize when heuristics can lead to diagnostic error. (4, 5, 8, 9)  Such cognitive forcing strategies include identifying patient populations or clinical presentations where our structured biases leave us at high risk of diagnostic error.  They also include helping clinicians develop the habit of patient reassessment and of transitioning to analytical reasoning when the gathered data do not fit the intuitive impression, including when patients do not follow the expected clinical trajectory or do not respond as expected to intervention.  Teaching metacognition may be as important a learning objective in simulations and case discussions as the pathophysiology of the individual cases themselves.

Instill the value of learning from failure

Optimizing our intuitive processes requires opportunities to recalibrate our mental models.  To optimize clinical judgment, we need to know when we are wrong and to value errors as opportunities to improve performance.  Indeed, one of the greatest barriers to improving diagnostic safety in EMS is the difficulty of obtaining patient follow-up.  I often tell my paramedic students that patient follow-up is some of the most valuable feedback you can get.  Yet even beyond the difficulty of getting regular follow-up on patient diagnosis and outcome, opportunities for learning and improvement are squandered when diagnostic errors are identified, because error is seen as failure and failure is not reframed as opportunity.  There are leaders who prefer to tell stories of their successes, but it is actually more important to share what you have learned from failures. (10)  This models not only that it is safe to share your mistakes, but also the process by which learning can occur – for the individual and the system – as a result of an openly discussed error.  Critical examination of diagnostic errors can be practiced during open case reviews and simulations.  In practicing this, we succeed in developing the skills and culture for continuous improvement.

The iterative nature of the diagnostic process places enormous responsibility upon EMS clinicians and the dynamic environment in which they work.  Addressing diagnostic safety is complex and simple at the same time.  Diagnosis is complex, and textbook presentations are far rarer than the alternative, but the human art of patient assessment, reassessment and re-evaluation remains central.  While technological tools and system-based solutions can help, I cannot foresee a future where the role of the clinician is completely eliminated.  For those practicing in the realm of EMS medicine, that's a challenge to which we will always need to rise.

References:

1. J. Reason, Human error: models and management. BMJ. 320, 768–770 (2000).

2. T. V. Perneger, The Swiss cheese model of safety incidents: are there holes in the metaphor? BMC Health Serv Res. 5, 71 (2005).

3. Committee on Diagnostic Error in Health Care, Board on Health Care Services, Institute of Medicine, The National Academies of Sciences, Engineering, and Medicine, Improving Diagnosis in Health Care (National Academies Press (US), Washington (DC), 2015; http://www.ncbi.nlm.nih.gov/books/NBK338596/).

4. M. L. Graber, S. Kissam, V. L. Payne, A. N. D. Meyer, A. Sorensen, N. Lenfestey, E. Tant, K. Henriksen, K. LaBresh, H. Singh, Cognitive interventions to reduce diagnostic error: a narrative review. BMJ Qual Saf. 21, 535–557 (2012).

5. R. L. Trowbridge, G. Dhaliwal, K. S. Cosby, Educational agenda for diagnostic error reduction. BMJ Qual Saf. 22 Suppl 2, ii28–ii32 (2013).

6. J. Clear, Atomic Habits: An Easy & Proven Way to Build Good Habits & Break Bad Ones (Penguin, 2018).

7. P. Croskerry, Clinical cognition and diagnostic error: applications of a dual process model of reasoning. Adv Health Sci Educ. 14, 27–35 (2009).

8. P. Croskerry, Perspectives on diagnostic failure and patient safety. Healthc Q. 15, 50–56 (2012).

9. P. Croskerry, Critical thinking and decisionmaking: avoiding the perils of thin-slicing. Ann Emerg Med. 48, 720–722 (2006).

10. C. G. V. Coutifaris, A. M. Grant, Taking your team behind the curtain: the effects of leader feedback-sharing and feedback-seeking on team psychological safety. Organization Science, in press, doi:10.1287/orsc.2021.1498.

A version of this article was originally published in NAEMSE's Educator Update (July 2022).
