"Uncertainty sometimes is essential for success."
A scientist talking? An NFL coach? A four-star general?
Actually, Jerome Groopman, a physician at Harvard Medical School who also writes for The New Yorker. The words are from his new book, How Doctors Think, which is about how doctors diagnose illnesses. It turns out there are lessons in there for managers as well, which is why Harvard Business School's "Working Knowledge" also has a piece on it.
How Doctors Think focuses, understandably, on cases where doctors got the diagnosis wrong, and attempts to map these errors systematically into various "failure modes" of reasoning, in the hope of making the mistakes more recognizable and avoidable. As Groopman puts it, "most errors [in medicine] are mistakes in thinking." Thus we have:
- "Attribution errors," where cognitive, rational thinking is influenced by stereotypes. To take a negative example, it’s when five doctors over the course of 15 years all fail to diagnose an endocrinologic tumor causing peculiar symptoms in “a persistently complaining, melodramatic menopausal woman who quite accurately describes herself as kooky.” To take a "positive" example (but one no less menacing to the patient), an emergency room doctor misses unstable angina in a forest ranger because “the ranger’s physique and chiseled features reminded him of a young Clint Eastwood—all strong associations with health and vigor.”
- "Momentum errors," where a previous diagnosis "like a boulder rolling down a mountain, gains enough force to crush anything in its way.” An "enabler" of momentum errors is "confirmation bias," meaning our preference to emphasize data that supports our pre-existing view and discount data that contradicts or undermines it.
- "Availability bias," meaning the reflexive reach for a diagnosis that appears instantly obvious, and the failure to withhold judgment in light of the possibility of the same symptoms being caused by an entirely different mechanism: For example, diagnosing pneumonia when the real culprit was an aspirin overdose.
How, then, do we guard ourselves, as physicians or, more realistically, as managers, against these psychological pitfalls?
Essentially, we need to insist that we ourselves exercise the most rigorous critical thinking we can muster. “It is a matter,” Groopman writes, “of juggling seemingly contradictory bits of data simultaneously in one’s mind and then seeking other information to make a decision, one way or another. This juggling . . . marks the expert physician — at the bedside or in a darkened radiology suite.”
- Don’t jump to conclusions.
- Be aware of how your own emotions towards the subject—say, a mild antipathy to the partner proposing a novel initiative—will color your reaction.
- Ask simple questions, such as "What’s the worst that could happen?" "What else might explain this?" "What doesn’t fit?" "Could there be more than one explanation?"
- Don’t accept lazy explanations: "Profits were down in my practice group because we were less busy."
- Permit yourself to say out loud: "I haven’t figured it out yet."
- Insist on utter clarity in communication. "Tell me again?" And actively encourage others to engage. Ask open-ended questions. Wake people up (mentally).
Make, in other words, the right business diagnosis.