This is Part 1 of a 4-part series on cognitive errors in medicine.
Emergency medicine, despite our love of action and procedures, is primarily a cognitive profession. We see patients with vague, undifferentiated symptoms and have to rapidly and accurately arrive at a diagnosis and management plan. Unfortunately, despite our best efforts, we occasionally err. Cognitive biases, rather than knowledge deficits, are thought to be the primary cause of our errors.
Cognitive biases are predictable, systematic errors in cognition. The word “bias” was originally used in the game of bowls to describe the deliberate weighting on one side of a bowl to give it a tendency to deviate from a straight line. In cognitive science, a bias is a predictable tendency in thinking to favour one perspective over others. Biases diminish the accuracy of an observation, but not necessarily its precision.
You may be tempted to think that these biases don’t affect you, even if you can recognize them in others. The blind spot bias describes the tendency to fail to recognize one’s own errors, and it appears to be hardwired into the human mind. However, the tendency towards cognitive errors is not a comment on intelligence. Rationality and intelligence are different. Just as intelligence alone will not let you solve advanced mathematical problems without training in mathematics, you will not master rationality without studying its processes and known errors.
The dominant theory of human cognition posits that cognitive processes consist of two distinct systems. System 1 represents intuitive, unconscious reasoning that relies on heuristics or mental shortcuts. It is quick and requires minimal effort. Consequently, it is the process used most frequently. However, because of its reliance on heuristics, system 1 is prone to bias and cognitive errors. System 2, on the other hand, represents conscious, analytic thought. It is slow, deliberative, and requires significant effort. It is generally thought to be less prone to errors. If you want to sound smart (or quickly look this up when nobody’s looking), this is called “dual-process theory”.
You may also run across the term “cognitive disposition to respond” (or CDR). The term is used by some because it is thought to have fewer negative connotations than “cognitive bias”. However, I find the term unwieldy and unnecessarily vague, so I will stick with “bias”. We all have biases. It is counterproductive to pretend otherwise.
Before we explore the various cognitive biases that we all fall prey to, it is important to note that there is nothing intrinsically bad about either system of cognition. Heuristics are essential in that they allow us to be efficient and make quick decisions in complex scenarios. However, they are also prone to the various biases described in part 2 of this series and can therefore lead to errors.
Although we tend to focus on system 1 as the source of our errors, system 2 in isolation is not perfect either. Consider the expert resuscitationist leading the resuscitation of a dying trauma patient. If you were the patient, would you prefer a doctor who slowly considers each textbook diagnosis and applies Bayesian reasoning to methodically work out the likelihood of various injuries, or would you want a doctor who quickly analyses the scene at a glance and then begins to treat your life-threatening injuries based on her expert heuristics? You cannot learn to shoot a free throw simply by reading a book about basketball. It requires practice. Similarly, you cannot become an expert resuscitationist by simply reading textbooks – you need practice.
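To make the Bayesian reasoning mentioned above concrete: the standard bedside shortcut is to convert a pretest probability to odds, multiply by the test’s likelihood ratio, and convert back. The numbers below (a 10% pretest probability and a positive likelihood ratio of 5) are hypothetical, chosen only to illustrate the arithmetic, not drawn from any particular study:

```python
def post_test_probability(pretest_prob, likelihood_ratio):
    """Bayesian updating via odds: posttest odds = pretest odds x LR."""
    pretest_odds = pretest_prob / (1 - pretest_prob)
    posttest_odds = pretest_odds * likelihood_ratio
    return posttest_odds / (1 + posttest_odds)

# Hypothetical example: 10% pretest probability, positive test with LR+ = 5
print(round(post_test_probability(0.10, 5), 3))  # → 0.357
```

Even this simple calculation takes deliberate, system 2 effort at the bedside, which is exactly the tradeoff described above: the analytic route is more explicit, but far slower than expert pattern recognition.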
In fact, there is reason to believe most expert cognition relies on heuristics. This is part of the reason it has always been so hard for expert clinicians to explain their diagnostic reasoning to medical students – they aren’t necessarily following explicit analytic pathways, but rather relying on gestalt. Emergency physicians are expected to make diagnostic decisions in very limited time. A delayed diagnosis, even if the delay was for the sake of accuracy, may be futile if the patient has already deteriorated. Pattern recognition is efficient and often extremely accurate, but unfortunately we know there are some situations in which it can fail us:
- The pattern might be misidentified. A rash that looks a lot like classic shingles might actually be an unlikely distribution of poison oak.
- The practitioner may not have seen enough cases to develop an accurate pattern. You may have seen 10,000 sore throats and developed a very rapid diagnostic pattern recognition approach, but if you have never seen a case of Lemierre’s disease, you will miss that pattern.
- The pattern may not be classic. The elderly woman presenting with fatigue does not fit our classic pattern of ACS and is more likely to be missed. We are also likely to miss diseases that present very early, before the classic pattern develops, or diseases for which the classic pattern significantly overlaps with other diseases.
- Our vigilance over the shortcomings of pattern recognition may become compromised. For example, you might recognize GERD by its pattern, but remain constantly vigilant for a myocardial infarction with a similar presentation. However, at 3am after 2 horrendous resuscitations, that vigilance may be compromised, resulting in misdiagnosis.
In almost everything we do, there is likely to be a tradeoff between the efficiency of heuristics and the potential accuracy of critical thinking. Neither is intrinsically good or bad. Both are necessary.
I will briefly note that although dual process theory makes intuitive sense and works well when applied to controlled psychological experiments, there are numerous failings when it is applied to real, complex human cognition. A few of these problems will be explored further in part 4 of this series. Also, although the majority of the literature has been focused on dual process theory, there are other theories of human cognition that may ultimately fit better. For example, the integrated or reciprocal theory, based primarily on the work of Howard Margolis, posits that rational thought is the result of system 1 and system 2 constantly working together, rather than the separate and even adversarial relationship described by dual process theory.
If you really want to learn about the intricacies and failings of human cognition, the Nobel laureate Daniel Kahneman is the person to turn to. I would suggest starting with his famous book Thinking, Fast and Slow. If you would prefer an easier read, David McRaney is an excellent science reporter who has written two fantastic books on human cognitive biases: You Are Not So Smart and You Are Now Less Dumb.
In part 2 of this series I will provide descriptions of the common cognitive errors we commit in medicine.
Croskerry P. Clinical cognition and diagnostic error: applications of a dual process model of reasoning. Adv Health Sci Educ Theory Pract. 2009;14 Suppl 1:27-35. PMID: 19669918
Croskerry P. Diagnostic Failure: A Cognitive and Affective Approach. In: Henriksen K, Battles JB, Marks ES, Lewin DI, editors. Advances in Patient Safety: From Research to Implementation (Volume 2: Concepts and Methodology). Rockville (MD): Agency for Healthcare Research and Quality (US); 2005 Feb. PMID: 21249816 [Free full text]
Groopman J. How Doctors Think. Houghton Mifflin Harcourt; 2008.
Kahneman D. Thinking, Fast and Slow. Doubleday Canada; 2011.
Norman GR, Eva KW. Diagnostic error and clinical reasoning. Med Educ. 2010;44(1):94-100. PMID: 20078760
Tversky A, Kahneman D. Judgment under Uncertainty: Heuristics and Biases. Science. 1974;185:(4157)1124-31. PMID: 17835457