Author: Robert McAuley, PhD
Purpose
Establish an understanding of mechanisms that lead individuals to focus on data that reinforces their beliefs to the exclusion of contradictory facts so that students can better apply the principles of evidence-based medicine.
Learning Objectives
(1) Recognize the phenomenon of confirmation bias;
(2) Recognize that bias can occur among individuals seeking objective truths; and
(3) Recognize that increased certainty does not predict accuracy.
A previous Preceptor post, Constructing Memories, discussed how the reconstructive nature of memory and the acquisition of third-party information can alter episodic memories. Specifically, it examined how humans interpret perceptual information in light of prior experiences to create stories about current or past events and then act on what those stories tell them. This article examines the mechanisms that lead to over-reliance on evidence that supports pre-existing conclusions and to the exclusion of contradictory information.
Early humans developed rapid reasoning abilities to identify threatening situations before they became critical. One of humankind's early ancestors may have looked out of the shelter one morning and thought, “I see saber-toothed tiger prints leading up to that tree over there, and there is a definite scent of saber-toothed tiger in the air. In my experience, that means there is a saber-toothed tiger nearby. I’ll stay here by the fire where it’s safe and warm.”
He had no incentive to ensure he wasn’t wrong. He could have tested his hypothesis by looking behind the tree, but at best he would have found nothing there; at worst, he would have ended up as the tiger’s breakfast. He was content to accept the evidence confirming the tiger’s presence and stay safe.
The ability to make rapid assessments and act, even if the action meant waiting for the tiger to go away, increased the likelihood that our ancestors would live long enough to reproduce. Humankind continued to be well served by the ability to rapidly collect data and make decisions as they progressed from hunters and gatherers to farmers. Those who learned to identify the signs of threatening weather and changes in the seasons knew to seek safety or to ensure they had sufficient supplies until the next growing season. Those unable to assess the evidence and act were more likely to end up dead from malnourishment, exposure, or a saber-toothed tiger and were unlikely to live long enough to pass their genes to the next generation. The early hunters who identified the signs indicating the presence of food and water sources were most likely to secure those resources and survive.
The ability to rapidly recognize patterns in data and form conclusions is still a critical skill. Frey and Adesman demonstrated that memory for chess positions depends on meaningful structure: the effect of “chess skill varies directly with the amount of chess-specific information in the chess display.”(1) Expert chess players were no better than novices at recalling nonsensical arrangements of chess pieces.(1)
Rikers, Loyens, and Schmidt demonstrated that as physicians develop expertise through clinical experience, they reorganize and encapsulate pathophysiological knowledge to rapidly recognize patterns of signs and symptoms without relying on elaborate pathophysiologic reasoning.(2) These cognitive shortcuts, sometimes referred to as heuristics, allow physicians to diagnose and treat patients quickly. This ability is essential when managing large caseloads and critical care situations where successful outcomes depend on rapid diagnoses and interventions.
The drawback of these shortcuts is the risk of premature closure and an incorrect diagnosis due to confirmation bias: specifically, the belief that the most quickly arrived-at diagnosis is the most likely.(3) Nickerson defines confirmation bias as “the seeking or interpreting of evidence in ways that are partial to existing beliefs, expectations, or a hypothesis in hand.”(4) Neal et al. demonstrated that low-effort cognitive processes are more likely to succumb to bias than effortful processes and argue for a “consider-the-opposite” strategy when reaching conclusions.(5) Mendel et al. showed that 13% of psychiatrists demonstrated confirmation bias when searching for new information after making a preliminary diagnosis.(6) Elston reminds us that while considering disease prevalence and the patient's clinical history can lead to a more accurate diagnosis, it can also bias the clinical diagnosis, and he cautions us “to keep in mind that your preferred diagnosis could be wrong.”(3)
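To make the stakes concrete, consider a minimal Bayesian sketch, not drawn from any of the studies above and using entirely hypothetical probabilities, of how a diagnosis can grow more certain without growing more accurate: each weakly discriminating “confirming” finding nudges the probability of a favored diagnosis upward, while a single test chosen because it could disconfirm carries far more information.

```python
# Illustrative sketch with hypothetical numbers: Bayesian updating shows how
# collecting only weakly confirming evidence inflates certainty without
# improving accuracy, while a potentially disconfirming test is far more
# informative.

def update_on_positive(prior, p_pos_given_disease, p_pos_given_no_disease):
    """Posterior P(disease) after a positive finding, via Bayes' rule."""
    p_pos = p_pos_given_disease * prior + p_pos_given_no_disease * (1 - prior)
    return p_pos_given_disease * prior / p_pos

p = 0.10  # hypothetical prevalence-based starting probability

# A nonspecific finding: common in the disease (0.90) but also common
# without it (0.60). Each positive result barely discriminates, yet
# repeatedly seeking such findings steadily raises certainty.
for _ in range(4):
    p = update_on_positive(p, 0.90, 0.60)
    print(f"after another nonspecific confirmation: P(disease) = {p:.2f}")

# Consider-the-opposite: one discriminating test (sensitivity 0.95,
# specificity 0.90) ordered because it could rule the diagnosis out.
sens, spec = 0.95, 0.90
p_negative = (1 - sens) * p + spec * (1 - p)   # P(test result is negative)
p = (1 - sens) * p / p_negative                # P(disease | negative test)
print(f"after one negative discriminating test: P(disease) = {p:.2f}")
```

In this toy example, four nonspecific confirmations raise P(disease) from 0.10 to roughly 0.36, while a single negative result on a discriminating test drops it to about 0.03; this is the “consider-the-opposite” strategy expressed quantitatively.(5)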
There is a discernible set of cognitive biases demonstrating that human reasoners are irrational in predictable, though possibly culturally specific, ways.(7) Clinical researchers like Pat Croskerry have documented common cognitive biases in medicine.(8-10) Jerome Groopman has written accessible accounts that we recommend, including the book How Doctors Think and the New Yorker article “What’s the Trouble?”.(11-12) Groopman’s work is based on Croskerry’s research and on his own reflections on his biases in clinical work. Both Croskerry and Groopman offer methods for cognitive debiasing, which will be the topic of a future Preceptor article.■
References
(1) Frey PW, Adesman P. Recall memory for visually presented chess positions. Mem Cognit. 1976;4(5):541-547. doi:10.3758/BF03213216
(2) Rikers RM, Loyens SM, Schmidt HG. The role of encapsulated knowledge in clinical case representations of medical students and family doctors. Med Educ. 2004;38(10):1035-1043. doi:10.1111/j.1365-2929.2004.01955.x
(3) Elston DM. Confirmation bias in medical decision-making. J Am Acad Dermatol. 2020;82(3):572. doi:10.1016/j.jaad.2019.06.1286
(4) Nickerson RS. Confirmation bias: a ubiquitous phenomenon in many guises. Rev Gen Psychol. 1998;2(2):175-220. doi:10.1037/1089-2680.2.2.175
(5) Neal TMS, Lienert P, Denne E, Singh JP. A general model of cognitive bias in human judgment and systematic review specific to forensic mental health. Law Hum Behav. 2022;46(2):99-120. doi:10.1037/lhb0000482
(6) Mendel R, Traut-Mattausch E, Jonas E, et al. Confirmation bias: why psychiatrists stick to wrong preliminary diagnoses. Psychol Med. 2011;41(12):2651-2659. doi:10.1017/S0033291711000808
(7) Doces JA, Wolaver A. Are we all predictably irrational? An experimental analysis. Polit Behav. 2021;43:1205-1226. doi:10.1007/s11109-019-09579-0
(8) Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med. 2003;78(8):775-780. doi:10.1097/00001888-200308000-00003
(9) Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf. 2013;22 Suppl 2(Suppl 2):ii58-ii64. doi:10.1136/bmjqs-2012-001712
(10) Croskerry P, Singhal G, Mamede S. Cognitive debiasing 2: impediments to and strategies for change. BMJ Qual Saf. 2013;22 Suppl 2(Suppl 2):ii65-ii72. doi:10.1136/bmjqs-2012-001713
(11) Groopman J. How Doctors Think. Houghton Mifflin Harcourt; 2008.
(12) Groopman J. What’s the Trouble? The New Yorker. January 29, 2007. https://www.newyorker.com/magazine/2007/01/29/whats-the-trouble