Cognitive errors in medicine: Mitigation of cognitive errors

Cognitive errors

This is Part 3 of a 4-part series.

  1. Part 1: A brief overview of cognitive theory
  2. Part 2: Common cognitive errors
  3. Part 4: Problems with cognitive theory

Part 2 of this series consisted of a long list of common cognitive errors we are all prone to committing. It can be discouraging to think that, despite all our training and all our effort, our brains are out to get us; that error may be inevitable. However, there may be some ways to mitigate our biases.

This post will focus on a few of the strategies that have been described to deal with the shortcomings of human cognition. The goal is to override intuitive system 1 thinking and ensure that our rational system 2 has had its say. The process of examining your own cognition goes by various names: metacognition, mindfulness, or self-reflection. Whatever the label, the idea is to think about your thinking, which will hopefully allow you to identify potential cognitive problems.

Unlike the cognitive biases themselves, which are relatively well described, we don’t fully understand how to mitigate them. In fact, in some cases they may simply be unavoidable. As is discussed in Part 4 of this series, metacognition and cognitive forcing strategies have no evidence base in medicine. Although they have face validity, we should always be wary of theory. These techniques will probably be helpful, but they may not be. In fact, they could even result in harm to our patients. (Imagine that you start asking “what else can this be?” with every patient presenting with classic asthma. The result might be more chest x-rays and more CTs, despite the fact that imaging is clearly not required in asthma.)

However, in the absence of evidence, we just have to do our best. We know we are prone to certain errors, so it makes sense to insulate ourselves against those errors. Below are a few of the many described cognitive forcing strategies designed to help physicians avoid or recognize their cognitive biases.

Learning the cognitive biases

The good news is that just by reading this series, you have already accomplished what most experts recommend as the first step: developing an awareness of cognitive errors. The idea is that if you know the cognitive traps, you are more likely to recognize when you are falling into one.

Routinely consider alternatives

Develop the routine of asking yourself: what else could this be? This question helps to prevent premature closure and diagnostic momentum. It might allow you to move beyond representativeness error or availability bias. It is a way to force yourself out of system 1 pattern recognition into a slower system 2 deliberation.

Always attempt to disconfirm

Disconfirmation is the core of the scientific method. Always ask yourself: does anything not fit? This helps you avoid anchoring and confirmation bias by forcing you to actively seek out evidence that might disconfirm your diagnosis.

Cognitively unload

If you expend less energy on memory retrieval tasks, you will have more available for critical thinking. Mnemonics, apps, electronic resources, and pocket books are all great ‘external brains’ that will free up your ‘internal’ brain to make crucial decisions.

Specific training in areas prone to cognitive errors

We know that humans are intrinsically poor at probabilistic and statistical thinking. Specific teaching about common pitfalls and basic probability theory may help prevent these errors.
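To see how counterintuitive even basic probability can be, consider a worked example. The numbers here are hypothetical, chosen purely for illustration: a test with 90% sensitivity and 90% specificity for a disease with a prevalence of 1%. Bayes’ theorem gives the probability of disease after a positive result:

```latex
P(D \mid +) = \frac{P(+ \mid D)\,P(D)}{P(+ \mid D)\,P(D) + P(+ \mid \neg D)\,P(\neg D)}
            = \frac{0.90 \times 0.01}{0.90 \times 0.01 + 0.10 \times 0.99}
            \approx 0.083
```

Intuition suggests that a positive result from a “90% accurate” test means a 90% chance of disease; the true post-test probability is about 8%. That gap is base rate neglect in action, and working through examples like this is exactly the kind of targeted training this strategy describes.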

Write out your differential diagnosis

We all recognize that no diagnosis is perfectly certain. The act of writing out a differential diagnosis forces you to consider alternatives. Readdressing the differential once test results are back makes you ask whether you have adequately ruled out dangerous conditions. It can also remind you to discuss the differential diagnosis and uncertainty with your patients, which will mitigate the effects of misdiagnosis when it occurs.
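As a sketch of what readdressing the differential can look like quantitatively (again with hypothetical numbers: a pretest probability of 15% and a negative test with a likelihood ratio of 0.1, which don’t correspond to any particular test), a quick Bayesian update makes the residual risk explicit:

```latex
\text{pretest odds} = \frac{0.15}{1 - 0.15} \approx 0.18, \qquad
\text{posttest odds} \approx 0.18 \times 0.1 = 0.018, \qquad
P_{\text{post}} = \frac{0.018}{1 + 0.018} \approx 1.8\%
```

Whether 1.8% is low enough to stop testing depends on the condition: it might be acceptable for a benign diagnosis but not for one that is lethal if missed. Writing the number down next to the differential forces that judgement into the open rather than leaving it to gut feeling.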

Cognitive stop points

Develop a routine of stopping at specific points in patient care to consider your thinking. I typically force myself to stop twice with every patient to consider the differential diagnosis: just before entering the initial orders and just prior to discharge. Other times that a cognitive pause makes sense include: when things aren’t progressing as you expect (starting your second vasopressor with no effect), just before moving a patient out of resuscitation (to radiology, to the ICU, or to another facility), and whenever you are involved in patient handovers.

Establish a culture of accountability and feedback

You cannot learn from your errors if you never know about them. Rather than a culture of shame, try to develop a culture that recognizes that all humans make mistakes and attempts to address systemic issues. In emergency medicine, you often won’t hear about misdiagnoses until months later (if at all). A system that encourages rapid feedback will allow you to examine your cognitive biases and work to improve.

Address the systemic factors that increase errors

  • Address fatigue. This is a topic unto itself, but one every emergency physician should have a strategy for, including appropriately scheduled shifts, casino shifts, adequate time off, naps, and of course caffeine.
  • Minimize interruptions. This is not easy in an emergency department, but there are many strategies available. Some departments have developed a ‘cone of silence’: an area where a clinician who needs to concentrate can sit without being interrupted.
  • Minimize time pressures. Again, this is not easy to do, but an adequately staffed emergency department is essential to optimal cognitive processing.
  • Acknowledge your emotions. Try to avoid making decisions when viscerally aroused. Recognize when you might have positive or negative feelings towards a patient. If you are angry or overwhelmed, take two minutes to step outside for some fresh air before making any important decisions.

Simulation training and mental practice

Designing training scenarios that allow cognitive errors to occur, and then debriefing with a specific focus on the biases that may have been at play, may help prevent those errors in the future.

One method that is easy to remember and incorporates a lot of the above is to ask 5 questions with every patient you see:

  • What traps might I be falling into?
  • What else can it be?
  • Is there anything that doesn’t fit (disconfirmation)?
  • Is there more than one thing going on?
  • Is this a case where I need to slow down?

Another cognitive forcing strategy that I find very useful when teaching students and residents is the use of the SPIT differential diagnosis. You have to identify the most:

  • Serious diagnosis
  • Probable diagnosis
  • Interesting diagnosis
  • Treatable diagnosis

The act of expanding your differential diagnosis is probably the most important part of this strategy, as it helps to avoid premature closure, anchoring, and search satisfaction. Considering the most serious condition helps you avoid availability bias and playing the odds. Forcing yourself to declare the most probable diagnosis helps you focus on pretest probabilities and avoid base rate neglect. Including the most interesting diagnosis helps avoid a zebra retreat. Finally, considering treatable conditions helps to remind you of actions you need to take, avoiding errors of omission. Running the list forces you to look for disconfirming evidence, helping to combat confirmation bias.

Obviously, some of those goals conflict with each other. You can’t strictly adhere to the base rate while simultaneously avoiding zebra retreat. You still need to use your clinical judgement to arrive at a decision, but by running through the list you should have a better sense of why you made the decision you did.

I use most of these techniques routinely, and teach them to my students. However, I have to admit there is no evidence that I am doing any good – as I will discuss in Part 4.

References

Croskerry P. Diagnostic Failure: A Cognitive and Affective Approach. In: Henriksen K, Battles JB, Marks ES, Lewin DI, editors. Advances in Patient Safety: From Research to Implementation (Volume 2: Concepts and Methodology). Rockville (MD): Agency for Healthcare Research and Quality (US); 2005 Feb. PMID: 21249816 [Free full text]

Croskerry P. From mindless to mindful practice–cognitive bias and clinical decision making. N Engl J Med. 2013;368:(26)2445-8. PMID: 23802513

Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med. 2003;78:(8)775-80. PMID: 12915363

Cite this article as:
Morgenstern, J. Cognitive errors in medicine: Mitigation of cognitive errors, First10EM, September 21, 2015. Available at:
https://doi.org/10.51684/FIRS.733
