A study published in BMJ Quality & Safety estimated that over twelve million Americans are misdiagnosed by their doctors each year. That is a substantial number, and it is not necessarily rooted in the incompetence of doctors but rather in the cognitive biases they fall prey to when diagnosing a patient. The Patient Safety Network of the US Department of Health and Human Services estimates that “about 75% of diagnostic errors have a cognitive component.” When things get hectic, for example in the emergency department, these errors appear more often as doctors rush from one patient to another, unable to step back and evaluate their work. Today, we will focus on two common and closely related sources of error: anchoring bias and confirmation bias.

According to Dr. Croskerry, confirmation bias in medicine is the predisposition to seek out mainly the evidence that verifies a diagnosis while ignoring or discounting evidence that contradicts it. He defines anchoring bias as the propensity to settle on a particular diagnosis based on a doctor’s initial impression, failing to consider other diagnoses as more details become available. The two biases work hand in hand; when you have one, you very often have the other, which makes things significantly worse. A nonmedical example of confirmation bias is buying a new car (say, a Honda Civic) and suddenly seeing everyone on the road driving the same model. People subject to confirmation bias see the world through a filter: information that supports their beliefs passes through, while information that opposes them gets caught and never reaches their consciousness. They remember the hits and forget the misses. Anchoring bias works similarly: if people are asked whether they would buy a $100 item and then told they can have it for $65, they may consider it a great bargain and feel more inclined to buy, because the $65 price tag seems cheap next to the $100 anchor. Anchoring bias ties people to the first information they receive, so they “jump to conclusions,” as Dr. Schiff puts it, and it becomes difficult to adjust to new information that contradicts what they thought they knew.


In medicine, these biases have far more serious consequences than the nonmedical examples above. Misdiagnosis can lead to unnecessary operations or to complications that are sometimes fatal down the road. Doctors may not look for information that narrows the list of possible diagnoses and instead keep looking for information that supports the diagnosis they already have in mind. In one study indexed on PubMed by the National Center for Biotechnology Information, 83% of medical students selectively sought information to prove a single diagnosis, disregarding information that pointed to other, equally likely diagnoses, and even used nondiagnostic data (data that does not point either way) to support their claims. In another study from the same database, doctors were 19% more likely to stick with their initial treatment plan when presented with three rather than two alternative options. With more alternatives on the table, choosing what was best for the patient became more complex, so doctors simply went with what they had thought best at first: the status quo. In yet another example, specialists sometimes failed to reach the correct alternate diagnosis for a referred patient because they were anchored to the diagnosis the primary care physician had already given, resulting in misdiagnoses and malpractice lawsuits down the road. The most striking example, however, was given to me by Dr. Kamal Singh, the chief of nuclear radiology at Kaiser, regarding one of his colleagues:

We had a child seen in the ED with diffuse abdominal pain that started in the right lower abdomen. Even though his physical exam was normal, his laboratory data were normal (no elevated white blood cell count), and an ultrasound of the appendix was normal, the ER doctor was insistent that the patient most likely had acute appendicitis. This led us to perform a CT scan, which also showed a normal appendix, though there were small lymph nodes near the appendix that were felt to be unrelated. Nevertheless, the ER physician was still not convinced. An alternative diagnosis of constipation was offered based on the CT scan, but ultimately the patient was taken to surgery, where his appendix was indeed found to be normal. The history of pain starting in the right lower abdomen was the basis of the doctor’s anchoring bias. He failed to consider other alternatives and pursued his initial diagnosis despite multiple imaging and laboratory findings suggesting otherwise. In the end, his persistence led to unnecessary surgery.

Image: Acute appendicitis on CT scan with phlegmon formation in the right lower abdomen (not the same patient)

Often, doctors commit these biases without even realizing it as they quickly try to form a diagnosis and figure out what is wrong. However, there are steps they, and the rest of us, can take to avoid confirmation and anchoring bias. One of the most important is disconfirmation: consciously looking for evidence that refutes the diagnosis the doctor has made. Put simply, the doctor has to look more intently at the misses than at the hits. Another important step is to account for a diagnostic test’s sensitivity (the percentage of patients who truly have the condition that the test correctly identifies) and its specificity (the percentage of patients who truly do not have the condition that the test correctly rules out), and to favor tests that score high on both; a short worked example follows below. Lastly, slowing down and taking a full patient history in order to see the bigger picture can help avoid mistakes that are usually made in haste. Stepping back after a few minutes and looking at the evidence with fresh eyes often makes a significant difference with cognitive biases like confirmation and anchoring bias. At the end of the day, being a doctor is about providing the highest quality of care to patients. Thankfully, these biases become much easier to avoid when doctors stay vigilant and aware of them, so that they can diagnose their patients correctly far more often.
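To make those two numbers concrete, here is a minimal Python sketch of how sensitivity and specificity are computed from a simple 2x2 table of test results. The counts are invented purely for illustration and do not come from any study cited in this post.

    # Hypothetical counts for a diagnostic test, invented for illustration only.
    true_positives = 90    # patients with the disease whom the test correctly flags
    false_negatives = 10   # patients with the disease whom the test misses
    true_negatives = 950   # patients without the disease whom the test correctly clears
    false_positives = 50   # patients without the disease whom the test wrongly flags

    # Sensitivity: of the patients who truly have the disease,
    # what fraction does the test correctly identify?
    sensitivity = true_positives / (true_positives + false_negatives)

    # Specificity: of the patients who truly do not have the disease,
    # what fraction does the test correctly rule out?
    specificity = true_negatives / (true_negatives + false_positives)

    print(f"Sensitivity: {sensitivity:.0%}")  # prints "Sensitivity: 90%"
    print(f"Specificity: {specificity:.0%}")  # prints "Specificity: 95%"

A test that is weak on either count makes it that much easier for an anchored doctor to find a result that seems to “confirm” a wrong diagnosis.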


References

Byrne, John. “Cognitive Biases.” Skeptical Medicine, sites.google.com/site/skepticalmedicine//cognitive-biases#TOC-Confirmation-Bias. Accessed 18 July 2017.

Clear, James. “The Confirmation Bias.” James Clear, jamesclear.com/common-mental-errors. Accessed 18 July 2017.

Croskerry, Pat. “50 Cognitive and Affective Biases in Medicine.” Saint John Regional Hospital Emergency Medicine, sjrhem.ca/wp-content/uploads/2015/11/CriticaThinking-Listof50-biases.pdf. Accessed 18 July 2017.

“ER Visits Jump as Obamacare Kicks In, Doctors Say.” Huffington Post, www.huffingtonpost.com/2014/05/21/obamacare-emergency-room_n_5352987.html. Accessed 18 July 2017.

Etchells, Edward. “Anchoring Bias with Critical Implications.” PSNet, psnet.ahrq.gov/webmm/case/350/anchoring-bias-with-critical-implications. Accessed 18 July 2017.

“5 Ways to Use the Anchoring Bias to Boost Conversions.” The Experiment, www.iaexperiment.com/blog/anchoring-bias/. Accessed 18 July 2017.

Morgenstern, Justin. “Cognitive Errors in Medicine: Common Errors.” First 10EM, first10em.com/2015/09/15/cognitive-errors/. Accessed 18 July 2017.

Pilcher, Charles. “Diagnostic Errors and Their Role in Patient Safety.” Kevin MD, 31 Mar. 2011, www.kevinmd.com/blog/2011/03/diagnostic-errors-role-patient-safety.html. Accessed 18 July 2017.