Physicians are human and, therefore, constantly vulnerable to cognitive bias. But this imperfection isn’t just theoretical. It can have huge effects on patient care.
An article published in the AMA Journal of Ethics® (@JournalofEthics) by Tiffany S. Doherty, PhD, a post-doctoral researcher, and Aaron E. Carroll, MD, MS, professor of pediatrics and associate dean for research mentoring, at Indiana University School of Medicine in Indianapolis summarized the most common cognitive biases physicians face in practice. The authors also outlined how these realities can affect clinical decision-making and how educators can work to decrease bias-related errors.
Cognitive biases are worrisome for physicians because they can affect a physician’s ability to gather and interpret evidence, take action, and evaluate their decisions, the authors noted. Here are four biases that commonly surface in medicine.
Confirmation bias involves selectively gathering and interpreting evidence to conform with one’s beliefs, as well as neglecting evidence that contradicts them. An example is refusing to consider alternative diagnoses once an initial diagnosis has been established, even though data, such as laboratory results, might contradict it.
“This bias leads physicians to see what they want to see,” the authors wrote. “Since it occurs early in the treatment pathway, confirmation bias can lead to mistaken diagnoses being passed on to and accepted by other clinicians without their validity being questioned, a process referred to as diagnostic momentum.”
Anchoring bias is much like confirmation bias and refers to the practice of prioritizing information and data that support one’s initial impression, even when that impression is incorrect. Imagine attributing a patient’s back pain to known osteoporosis without ruling out other potential causes.
Affect heuristic describes when a physician’s actions are swayed by emotional reactions instead of rational deliberation about risks and benefits. It is context- or patient-specific and can manifest when a physician experiences positive or negative feelings toward a patient based on prior experiences.
Outcomes bias refers to the practice of believing that clinical results—good or bad—are always attributable to prior decisions, even if the physician has no valid reason to think this, preventing the physician from assimilating feedback to improve performance.
“Although the relation between decisions and outcomes might seem intuitive, the outcome of a decision cannot be the sole determinant of its quality; that is, sometimes a good outcome can happen despite a poor clinical decision, and vice versa,” the authors wrote.
“Simply increasing physicians’ familiarity with the many types of cognitive biases—and how to avoid them—may be one of the best strategies to decrease bias-related errors,” the authors wrote, noting that medical education “could fruitfully invest in training on cognitive biases, the role they play in diagnostic and treatment errors, and effective debiasing strategies.”
A recent systematic review of cognitive intervention studies showed that two education strategies may help improve diagnostic outcomes.
Reflection reinforces behaviors that reduce bias in complex situations. Guided reflection interventions have been associated with the most consistent success in improving diagnostic reasoning.
“A guided reflection intervention involves searching for and being open to alternative diagnoses and willingness to engage in thoughtful and effortful reasoning and reflection on one’s own conclusions, all with supportive feedback or challenge from a mentor,” the authors wrote.
Cognitive forcing strategies involve conscious consideration of alternative diagnoses that don’t come intuitively. One example is reading radiographs in the emergency department, where inexperienced physicians often call off the search for a diagnosis once a positive finding has been noticed.
This rush to judgment “often leads to other abnormalities (e.g., second fractures) being overlooked,” the authors wrote. “Thus, the forcing strategy in this situation would be to continue a search even after an initial fracture has been detected.”
If these measures are to work, “we must consistently include them in medical curricula,” the authors noted. “During medical education and consistently thereafter, we must provide physicians with a full appreciation of the cost of biases and the potential benefits of combatting them.”
The September 2020 issue of the AMA Journal of Ethics further explores, in print and podcast, behavioral architecture in health care.