Surprise! They think like everybody else.
This is based on several articles, my own experience, and long-time interest in the topic.
‘System-1 and system-2’
This is the terminology of Kahneman:
- System-1 is the ‘fast thinking’ ― more intuitive if you like.
- System-2 is the ‘slow thinking’ ― more formal reasoning, hypothesis-testing.
System-1 is more subconceptual; system-2 more conceptual. Underlying both is some kind of pattern recognition.
Decades ago, Nobel Prize winner D. Kahneman posited the two systems, with the immediate warning that they are not neatly separated. He admonished us to use the terms only as a manner of speaking: there are no two separate systems in the brain/mind. This agrees well with subconceptual and conceptual mental processing forming parts of one continuum.
There are no two different kinds of doctor, a ‘system-1 doctor’ and a ‘system-2 doctor’. In reality, the landscape of medical thinking is diverse, although each physician does show a preponderance for one part of the landscape or another.
The best way is to be excellent over the entire landscape.
This also entails knowing when it’s best to go to which part of the landscape ― elegantly changing one’s position when needed.
Novices incline more toward system-2; experts more toward system-1. The latter tends to be quicker, yet it carries the risk of specific biases that may go unnoticed: availability, anchoring, representativeness, hindsight, overconfidence, etc.
As a result, experts also tend to know less about how they actually reach their conclusions. They think differently from how they think they think. Paradoxically, they may be unaware of this, using heuristics (shortcuts in thinking) covertly, even to themselves.
This can cause problems when 1) the expert teaches the novice, and 2) the expert is asked to make his knowledge explicit for a computerized expert system. That is why the ‘knowledge acquisition bottleneck’ became one of the biggest hurdles to the success of such systems in the eighties and nineties.
Diagnosis ‘by exclusion’
Experts tend to think quickly, using many shortcuts, for a reason. Venturing a rash diagnosis and therapy has its advantages:
- It gives an impression of competence and firm decision-making.
- This heightens the placebo effect of whatever has been done.
- There’s a big chance the patient recovers anyway, through his own strength.
- It might be the right diagnosis and therapy.
- If it doesn’t work, the diagnostic process may continue by exclusion.
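The exclusion loop in the last bullet can be caricatured as a toy algorithm. This is only an illustrative sketch: the function names and the `responds_to_treatment` predicate are hypothetical, not part of any real clinical system.

```python
def diagnose_by_exclusion(hypotheses, responds_to_treatment):
    """Work through candidate diagnoses, ranked by prior likelihood,
    excluding each one whose therapy fails to help.

    `hypotheses` is an ordered list of candidate diagnoses;
    `responds_to_treatment` is a (hypothetical) predicate reporting
    whether the therapy for a given diagnosis appeared to work.
    """
    excluded = []
    for diagnosis in hypotheses:
        if responds_to_treatment(diagnosis):
            # Apparent success: could also be placebo or self-healing!
            return diagnosis, excluded
        excluded.append(diagnosis)  # rule it out, continue by exclusion
    return None, excluded           # nothing confirmed: rethink the list

# Toy usage: the second candidate "responds" to its therapy.
result, ruled_out = diagnose_by_exclusion(
    ["viral infection", "bacterial infection", "allergy"],
    lambda d: d == "bacterial infection",
)
```

Note the comment on apparent success: as the bullets above point out, a positive response does not prove the diagnosis, since the patient may have recovered on his own.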
As said, physicians think like everybody else. That’s OK, as long as it’s acknowledged. Soon, computerized systems may support the thinking process: medical decision support systems.
I used to head the building of one… 30 years ago. It wasn’t a commercial success.
Maybe the time is ripe today?
Kahneman, D., Slovic, P., & Tversky, A. (Eds.). (1982). Judgment under uncertainty: Heuristics and biases. New York: Cambridge University Press.
Kassirer, J. P., & Kopelman, R. I. (1991). Learning clinical reasoning. Baltimore: Williams & Wilkins.
Elstein, A. S. (2009). Thinking about diagnostic thinking: A 30-year perspective. Advances in Health Sciences Education, 14(Suppl 1), 7–18.
Cosby, K. (2011). The role of certainty, confidence, and critical thinking in the diagnostic process: Good luck or good thinking? Academic Emergency Medicine, 18(2), 212–214.