Cognitive theory in medicine: Some problems


Finally, part 4 of this 4-part series explores some of the problems with the current description of cognitive theory and with attempts to apply what we know about cognitive biases to medicine.

Cognitive biases are an increasingly popular topic of discussion in the medical community. They are fascinating to read about. It is fun to consider our fallibility when we are so often expected to be perfect. Most descriptions of the topic focus on the content of the first three posts: dual process theory, biases, and possible solutions. However, there are several problems with dual process theory and its application to medicine that must be acknowledged.

The key message is that human cognition is incredibly complex and that simple discussions of a fast and slow system are probably inadequate. The debate about dual process theory is highly nuanced and I can’t do it justice here. (I may not understand it well enough to do it justice anywhere.) However, there are a few key points physicians should consider when teaching and applying cognitive theory in their practice.

First, remember that despite its popularity, dual process theory is not the only theory of human cognition. The cognitive science literature is replete with alternative theories and with evidence of the failings of dual process theory. Dual process theory is intuitive and applies well in simple psychological experiments involving undergraduate students, but it is not clear how well it applies to complex human reasoning, especially at the expert level. The clean division of thinking into system 1 and system 2 is questionable.

A major problem with dual process theory is the poorly defined interface between systems 1 and 2. In general, the process is simplified by saying that system 1 is always active, unconsciously and rapidly sorting through the world, while system 2 monitors system 1 and makes corrections as necessary. However, it is not clear how that monitoring happens. What triggers the transition from system 1 to system 2? Although system 2 is supposed to monitor system 1, the act of monitoring would seem to require rapid analysis of ongoing thought and pattern recognition of possible errors. In other words, the monitoring of system 1 sounds like a system 1 process.

Additionally, the common description of system 1 as unconscious and system 2 as conscious doesn’t fit well with current understandings of neuroscience. All conscious thought seems to start off, if only fractions of a second before, as unconscious thought. What this means for the distinction between two systems of thinking is unclear.

Although the behaviour of undergraduate students in psychological experiments is well described by dual process theory, it is not clear that complex expert reasoning fits as neatly into the two distinct systems of reasoning. Medical experts use heuristics to rapidly and efficiently shape deliberate thought, a cognitive pattern that does not neatly fit into classic dual process theory.

Furthermore, the assumption that system 2 thinking is more accurate than system 1 may not always hold. A number of studies of medical decision making have shown that rapid (system 1) decision making is often more accurate than slower, deliberate thought. Also, although the role of system 2 is supposed to be to overrule system 1, there are examples of the process occurring in reverse with good results. Every emergency doctor has met patients who looked good on paper, or when analyzed deliberately, but who we knew were sick based on gestalt. It would be a mistake to discount the value of these heuristics.

When I say that there are problems with dual process theory, I don't mean that systems 1 and 2 don't exist. There is a lot of good research illustrating these two forms of thinking. There are certainly times when we are on cruise control and other times when we are deliberate and analytical. However, the complete separation of the two processes may be incorrect. I certainly don't have the answers, but I think it's important to recognize that human cognition is a lot more complex than most recent popular medical books on the subject would lead us to believe.

That might be more cognitive science than most people want to think about, but I think it is important to recognize that there is no clean division in human thought. Fixing cognitive errors will not be as simple as turning off system 1 and letting the analytic, rational system 2 take over. There is nothing inherently good about system 2, nor anything inherently bad about system 1.

In medicine, we know to question physiologic reasoning and to rely instead on empiric evidence. Likewise, we should probably be wary of all this cognitive theory and ask ourselves: what is the evidence that this model helps our patients? If dual process theory turns out to be wrong, but the tools we developed from it still help us make better diagnoses, we should use them. Conversely, it doesn't matter how elegant the theory is if none of the resulting practices help our patients.

So far, unfortunately, the empiric analysis of cognitive theory in medicine has been disappointing. (Even outside of medicine, there is very little evidence that metacognition helps mitigate our cognitive biases.) Within medicine, the best answer is probably that we just don't know. Jonathan Sherbino (@sherbino) and his colleagues have produced, as far as I know, essentially all of the world's literature on attempting to address cognitive errors in medicine. They have conducted a number of small studies focused on cognitive forcing strategies, and none has shown an improvement in clinical reasoning. For example, in one controlled trial, resident physicians were presented with internal medicine cases and told either to answer as quickly as possible or to be careful, thorough, and reflective. Accuracy was identical regardless of the instructions. (PMID 24362377) In another study, accuracy on clinical vignettes was correlated with speed: the faster a physician answered (consistent with use of system 1), the more accurate they were. (PMID 22534592) Even 4 weeks of training in cognitive forcing strategies did not change diagnostic accuracy. (PMID 24423999) However, these studies all share a common shortcoming: they studied learners rather than experts, who we know approach the diagnostic process differently.

We should also be careful in interpreting the current research describing cognitive biases in medicine. It is easy to recognize an error and retrospectively analyze its causes. However, it is very difficult to recognize and analyze good decision making. Therefore, the literature is probably skewed by hindsight bias. We may identify one case of a missed myocardial infarction and blame the representativeness heuristic because the MI occurred in an otherwise healthy 18-year-old girl. However, we will fail to recognize the 100 correctly diagnosed MIs and, even more importantly, the thousands of young women with chest pain but without MI who were all correctly diagnosed precisely because they did not resemble the typical prototype.
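
To make that denominator problem concrete, here is a minimal sketch in Python. Every number is invented purely for illustration; the only point is the ratio of reviewable misses to invisible correct calls.

```python
# Hypothetical numbers for illustration only - not real epidemiology.
# Imagine 10,000 young women presenting with chest pain, where true MI
# is rare and pattern-based ("representativeness") reasoning is usually right.

patients = 10_000
mi_prevalence = 0.001   # assume 0.1% of these patients have an MI
sensitivity = 0.90      # assume gestalt catches 90% of the true MIs
specificity = 0.99      # and correctly reassures 99% of the non-MIs

true_mi = patients * mi_prevalence                    # 10 true MIs
missed_mi = true_mi * (1 - sensitivity)               # 1 miss
caught_mi = true_mi - missed_mi                       # 9 catches
correctly_reassured = (patients - true_mi) * specificity  # ~9,890 correct "not MI" calls

# A retrospective error review only ever sees the misses:
print(f"Misses available for retrospective review: {missed_mi:.0f}")
print(f"Correct decisions the review never examines: {caught_mi + correctly_reassured:.0f}")
```

The single miss gets a case review and a named bias; the nearly ten thousand correct pattern-based decisions are never examined, so the error literature cannot tell us whether the heuristic is net harmful or net protective.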

It is tempting to accept any strategy to mitigate cognitive biases in medicine as inherently valuable, but I think that is clearly wrong. Many of the described biases exist at opposite ends of a spectrum. Therefore, if you are consciously correcting for one form of bias, you may simultaneously be increasing your chance of falling for another. For example, if you recognize that you are retreating from the diagnosis of a 'zebra' and actively attempt to correct your thinking, you may quickly shift into a scenario in which you are ignoring the base rate of this rare condition. You can't save the patient with a rare disease unless you think about it, but if you think about it too often you will harm the many patients with common diseases.
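
A minimal Bayesian sketch of that trade-off, using invented test characteristics, shows why "always think of the zebra" is not free:

```python
def ppv(pretest, sens, spec):
    """Post-test probability of disease after a positive test (Bayes' theorem)."""
    true_pos = pretest * sens
    false_pos = (1 - pretest) * (1 - spec)
    return true_pos / (true_pos + false_pos)

# Invented characteristics for a hypothetical 'zebra' workup:
sens, spec = 0.95, 0.95

# Ignoring the base rate: work up everyone with the symptom (pretest = raw prevalence)
print(f"PPV at pretest probability 0.0001: {ppv(0.0001, sens, spec):.4f}")  # ~0.002
# Respecting the base rate: work up only when something raises suspicion
print(f"PPV at pretest probability 0.05:   {ppv(0.05, sens, spec):.2f}")    # 0.50
```

At the raw prevalence of a rare disease, almost every positive result is a false positive, so the workups, and their downstream harms, land overwhelmingly on patients who never had the disease.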

Although I present a number of cognitive forcing strategies in part 3 of this series, I think it is important to remember that these tools are unproven. They seem intuitively valuable, but so did antiarrhythmics for PVCs. In the quest to miss less, we will inevitably test more. If these tools, rather than catching our rare errors, result in increased testing of patients below the test threshold, we will end up causing harm. I think we need to be aware of these biases, not because we will be able to eliminate error from medicine, but as a reminder that we are all fallible.
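
As a rough illustration of the test-threshold idea, here is a toy expected-value model in Python. The utilities and test characteristics are hypothetical placeholders, not clinical numbers:

```python
def net_benefit_of_testing(p_disease, benefit_if_caught=1.0,
                           sensitivity=0.9, harm_of_test=0.01):
    """Expected utility of testing minus not testing (arbitrary units).
    Every parameter here is a hypothetical placeholder, not a clinical value."""
    expected_gain = p_disease * sensitivity * benefit_if_caught
    return expected_gain - harm_of_test

for pretest in [0.001, 0.005, 0.02, 0.10]:
    nb = net_benefit_of_testing(pretest)
    verdict = "test helps" if nb > 0 else "testing causes net harm"
    print(f"pretest {pretest:.3f}: net benefit {nb:+.4f} ({verdict})")
```

In this toy model the threshold sits near a 1% pretest probability; below it, the harms of the test itself outweigh any expected benefit, no matter how earnestly the workup was meant to avoid a bias.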

References

Eva KW, Norman GR. Heuristics and biases: a biased perspective on clinical reasoning. Med Educ. 2005;39(9):870-2. PMID: 16150023

Graber M. Metacognitive training to reduce diagnostic errors: ready for prime time? Acad Med. 2003;78(8):781. PMID: 12915364

Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med. 2005;165(13):1493-9. PMID: 16009864

Gigerenzer G, Goldstein DG. Reasoning the fast and frugal way: models of bounded rationality. Psychol Rev. 1996;103(4):650-69. PMID: 8888650

Norman GR, Eva KW. Diagnostic error and clinical reasoning. Med Educ. 2010;44(1):94-100. PMID: 20078760

Norman G, Sherbino J, Dore K, et al. The etiology of diagnostic errors: a controlled trial of system 1 versus system 2 reasoning. Acad Med. 2014;89(2):277-84. PMID: 24362377

Monteiro SD, Sherbino J, Patel A, Mazzetti I, Norman GR, Howey E. Reflecting on diagnostic errors: taking a second look is not enough. J Gen Intern Med. 2015. PMID: 26173528

Redelmeier DA, Shafir E, Aujla PS. The beguiling pursuit of more information. Med Decis Making. 2001;21(5):376-81. PMID: 11575487

Sherbino J, Dore KL, Wood TJ, et al. The relationship between response time and diagnostic accuracy. Acad Med. 2012;87(6):785-91. PMID: 22534592

Sherbino J, Kulasegaram K, Howey E, Norman G. Ineffectiveness of cognitive forcing strategies to reduce biases in diagnostic reasoning: a controlled trial. CJEM. 2014;16(1):34-40. PMID: 24423999

Sherbino J, Yip S, Dore KL, Siu E, Norman GR. The effectiveness of cognitive forcing strategies to decrease diagnostic error: an exploratory study. Teach Learn Med. 2011;23(1):78-84. PMID: 21240788
