I was chatting with Toby Ord recently about a series of events we think we’ve observed in ourselves:
In analyzing how the future might develop, we set aside some possible scenarios for various reasons: they’re harder to analyze, or they’re better outsourced to some domain expert at Cambridge, or whatever.
Some time passes, and we forget why we set aside those scenarios.
Implicitly, our brains start to think that we set aside those scenarios because they were low probability, which leads us to intuitively place too much probability mass on the scenarios we did choose to analyze.
I wish I had an intuitive name for this, one that made an analogy to some similar process, à la “evaporative cooling of group beliefs.” But the best I’ve heard so far is the “pruning effect.”
It may not be a special effect, anyway, but just a particular version of the effect whereby a scenario you spend a lot of time thinking about feels intuitively more probable than it should.
Well, it’s a pretty clear instance of the availability heuristic.