Editor’s note: The opinions expressed by the author do not necessarily reflect the opinions of the AAMC or its members.
Early in my second year of medical school, my class was given an assignment. We were told that a patient had come into the hospital with a slightly raised, blotchy red rash (purpura) and kidney failure. Armed with the fictitious patient’s history, physical exam notes, and lab test results, I promptly went to the medical library and hand-searched Index Medicus for papers on patients with this clinical picture. I finally diagnosed the patient with polyarteritis nodosa, an unusual autoimmune disorder.
I was quite proud of myself … until I presented my solution in class. There, I was told that not only was my diagnosis wrong, but that I had missed the most likely cause of the patient’s “palpable purpuric” rash: Henoch-Schönlein purpura. In hindsight, I realized that I had fallen prey to a common cause of diagnostic error: I had dismissed the more likely diagnosis because the patient’s other symptoms did not quite fit the pattern for that illness.
Since then, I’ve made other diagnostic errors — fewer than I did as a student during my clinical rotations and even fewer as my career progressed. But I’ve continued to be interested in learning about the causes of diagnostic error, in particular the “dual process theory” proposed by psychologists Daniel Kahneman and Amos Tversky in the early 1970s.
Simply put, Kahneman and Tversky theorized that we have two distinct thinking modes known as Systems 1 and 2. System 1 thinking is rapid, intuitive, and based on pattern recognition, while System 2 thinking is slower, more deliberate, and more analytical. In terms of our prehistoric past, System 1 tells us to run the other way when we see a saber-toothed tiger coming. System 2 helps when we see some berries we don’t quite recognize and have to decide if they are safe to eat.
In medicine, we first look to see if the patient’s signs and symptoms are characteristic (the System 1 pattern) of a particular disease. This has been called the “illness script,” and we learn more than 100 of these during the first two preclinical years of medical school.
We make mistakes with System 1 thinking when we believe there is a pattern where none exists. This can be dangerous, and it commonly leads to diagnostic errors. How can we avoid seeing patterns that aren’t really there? The short answer is to switch over to the more analytic System 2 thinking process, particularly for students and residents who are just starting their medical careers. Not every disease presents exactly as described in the illness script. There are atypical presentations of common diseases and many areas of overlap between conditions.
Experienced clinicians mostly diagnose using System 1 thinking. They match the patient’s illness characteristics with patterns that are learned in medical school and reinforced through years of experience. And this mostly works fine, leading to the correct diagnosis. But sometimes, we take mental shortcuts that can lead to errors.
The most common of these are:
- Premature closure of the differential diagnosis
- The representative heuristic
- The availability heuristic, and
- Stereotyping
Premature closure occurs when a clinician thinks they see the pattern of a specific illness before confirming that the patient actually has it. They have short-circuited the process of building a differential diagnosis, comparing the diseases on that list with the patient’s presentation, and taking whatever other steps are needed to narrow the diagnosis. This most common cause of diagnostic error occurs when a clinician “anchors” to a particular diagnosis too early in the clinical encounter and fails to adjust even when presented with conflicting information that doesn’t match the anchored diagnosis or its illness script.
The representative heuristic happens when the clinician sees the patient’s symptoms as matching the illness script for one disease when the patient in fact has another illness with similar characteristics.
The availability heuristic is the tendency to make the diagnosis that is top of mind, perhaps because the symptoms resemble a recently missed diagnosis or one recently presented in an educational setting. In these cases, we overestimate the true likelihood of the disease because it is more easily retrieved from memory.
And lastly, stereotyping. For example, we might assume that a young, athletic patient could not possibly be having a stroke, or that a patient who appears drunk does not have an underlying severe head injury.
We are not powerless to prevent these errors. First, we should accept that we all make cognitive slips that can lead to diagnostic errors. These are more likely to happen in emergencies or chaotic situations when we are multitasking or have frequent interruptions.
It is especially important in these situations, then, to intentionally switch from System 1 to System 2 thinking, especially when presented with a patient whose symptoms don’t quite seem to fit the illness script. This “diagnostic time out” is analogous to the “time out” before surgery, and it’s critically important when making a diagnosis – as a student or as an experienced clinician.
Students might try formally making a list of pros and cons: Which findings confirm the diagnosis, and which are contrary? What are the diseases that present this way that are potentially life- or limb-threatening and cannot afford to be missed?
For a patient with chest pain, for instance, you cannot afford to miss coronary artery disease, pulmonary embolism, pneumothorax, pneumonia, pericarditis, or aortic dissection. All of these can be ruled out with a good patient history, physical examination, basic laboratory tests, EKG, and chest X-ray.
In the interim, it’s always OK to tell your patient, “I’m not quite sure what you have, but I’m working on it.” Communicating uncertainty can be intimidating and difficult, but it’s far more important to get the diagnosis right than to rush into a diagnosis that doesn’t help the patient and may do harm.
Dan Meyer, MD, is a retired professor of emergency medicine at the Albany Medical College and an associate editor for MedEdPORTAL© in the field of emergency medicine. He gives frequent workshops on the prevention of diagnostic error at meetings of the Society for Academic Emergency Medicine, Association for Medical Education in Europe, and International Association for Medical Science Education.
Editor’s note: Those interested in learning more about preventing diagnostic errors can attend a session at the upcoming Learn Serve Lead 2019: The AAMC Annual Meeting titled “Improving Diagnostic Performance and Safety: A Call to Action for Education and Practice.” In addition, the AAMC has been working with the Society to Improve Diagnosis in Medicine (SIDM) to advance medical education on diagnostic safety and is collaborating on a session about improving diagnosis by enhancing competency-based education that will take place during SIDM's Diagnostic Error in Medicine conference, DEM2019.