In the post below, I ask you to make a diagnosis of a medical condition. Most people get it wrong, probably because the actual diagnosis is far removed from the setting presented. People apply inductive reasoning to a new problem, drawing probabilistic inferences from other situations with which they are more familiar.
I attended a seminar on Friday at which MIT's Joshua Tenenbaum presented a theoretical basis for this learning process. If you subscribe to Science Magazine, you can read his recent article on the topic: "How to Grow a Mind: Statistics, Structure, and Abstraction."
It turns out that people are reasonably good at inference, from a very young age, as Joshua notes:
Generalization from sparse data is central in learning many aspects of language, such as syntactic constructions or morphological rules. It presents most starkly in causal learning: every statistics class teaches that correlation does not imply causation, yet children routinely infer causal links from just a handful of events, far too small a sample to compute even a reliable correlation!
In a more theoretical section, the author describes a probabilistic, or Bayesian, model to explain this learning process:
How does abstract knowledge guide inference from incomplete data? Abstract knowledge is encoded in a probabilistic generative model, a kind of mental model that describes the causal processes in the world giving rise to the learner's observations as well as unobserved or latent variables that support effective prediction and action if the learner can infer their hidden state. . . . A generative model . . . describes not only the specific situation at hand, but also a broader class of situations over which learning should generalize, and it captures in parsimonious form the essential world structure that causes learners' observations and makes generalizations possible.
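Because the quoted passage is abstract, here is a minimal sketch of the idea -- not code from the article, and with made-up priors and likelihoods -- showing how Bayes' rule lets a learner strongly favor a causal hypothesis after only a handful of observations:

```python
# A toy Bayesian learner (illustrative numbers only, not from the article).
# It observes the effect follow a candidate cause on 3 of 3 trials and
# weighs two hypotheses about the world.

# Prior beliefs before seeing any data (assumed 50/50 for illustration).
priors = {
    "causal link":    0.5,   # the candidate cause reliably produces the effect
    "no causal link": 0.5,   # the effect occurs only at a low background rate
}

# Probability of seeing the effect on a single trial under each hypothesis.
p_effect = {
    "causal link":    0.9,
    "no causal link": 0.1,
}

observations = [True, True, True]   # effect observed on three trials

# Bayes' rule: posterior is proportional to prior times likelihood of the data.
unnormalized = {}
for hypothesis, prior in priors.items():
    likelihood = 1.0
    for effect_seen in observations:
        likelihood *= p_effect[hypothesis] if effect_seen else 1 - p_effect[hypothesis]
    unnormalized[hypothesis] = prior * likelihood

total = sum(unnormalized.values())
posterior = {h: p / total for h, p in unnormalized.items()}

for hypothesis, prob in posterior.items():
    print(f"P({hypothesis} | 3 consistent trials) = {prob:.3f}")
# Prints roughly 0.999 for "causal link" -- far too few trials to estimate a
# reliable correlation, yet enough to strongly favor one structured hypothesis.
```

With sensible priors, this kind of inference works remarkably well.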
Except when it doesn't work! As several of you demonstrated below, that same probabilistic model can lead to cognitive errors.
I summarized Pat Croskerry's explanation below:
Croskerry's exposition compares intuitive versus rational (or analytic) decision-making. Intuitive decision-making is used more often. It is fast, compelling, and addictive; it requires minimal cognitive effort and mainly serves us well. It can also be catastrophic, in that it leads to diagnostic anchoring that is not based on the true underlying factors.
Why the dichotomy? How can a learning process that works so well in some cases lead us astray in others? I asked Joshua, and he suggested that it might have to do with the complexity of the issue. For the functions that were important as humans evolved -- e.g., recognizing existential threats, or telling poisonous plants from edible ones -- a quick probabilistic inference was all that was needed.
Now, though, in a complex society, perhaps we get trapped by our inferences. The sense of tribalism that led us to flee from -- or fight -- people who looked different and who might have been seeking to steal our territory or food becomes evident now as unsupported and destructive racial or ethnic prejudice.
Likewise, the diagnostic approach to illness or injury that might have sufficed for simple health threats 10,000 years ago no longer produces the right result in a more complex clinical setting. Think about it. If you were a shaman or healer in a tribe, most conditions or illnesses healed themselves. You recognized the common ailments, you knew you didn't need to do much, and whatever herbs or amulets or incense you used did no harm. If you couldn't cure the disease, you blamed the evil spirits.
In contrast, as a doctor today, you are expected to apply an encyclopedic knowledge to a variety of complex medical conditions -- cancer, cardiovascular disease, liver and kidney failure -- that were relatively unknown back then. (You were more likely to die from something simpler at a much younger age!) Many cases you see today present a variety of symptoms, multiple causes, and several possible diagnoses. It is no surprise that your mind tries to apply -- in parsimonious form -- a solution. The likelihood of diagnostic anchoring is actually quite high, unless you take care. As I note below:
Croskerry thinks we need to spend more time teaching clinicians about the importance of decision-making as a discipline. He feels we should train people to recognize the various forms of cognitive bias, as well as affective bias. Given the extent to which intuitive decision-making will continue to be used, let's recognize that and improve our ability to carry out that approach -- by improving feedback, imposing circuit breakers, acknowledging the role of emotions, and the like.