If most failures of rationality are adaptively self-serving motivated reasoning
I would say that most failures of rationality were adaptive in the ancestral environments, but I wouldn’t say they all count as “motivated reasoning”.
Simple example: Seeing a snake in the grass, and responding as if there is a snake in the grass, in the presence of ambiguous stimuli that have only a 10% chance of being a snake, could well result in more surviving offspring than a more nuanced, likely slower, and closer-to-correct estimation of the probability there is a snake. But this is not a result of motivated reasoning where someone is advocating for their interests, it’s just a hack that our evolved brains have for keeping us alive using minimal energy and time for computation because calories were scarce and snakes sometimes moved quickly.
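The trade-off here is an expected-value comparison. A toy sketch, with all numbers made up purely for illustration (the 10% snake probability is from the example above; every cost and payoff is a hypothetical fitness unit, not a claim about real selection pressures):

```python
# Toy expected-fitness comparison: "always flee" vs. "deliberate first".
# All costs are in arbitrary, made-up fitness units.

p_snake = 0.10                      # chance the ambiguous stimulus is a snake

cost_flee = 1                       # calories/time wasted fleeing from nothing
cost_bitten = 100                   # fitness loss if a real snake strikes
cost_deliberate = 2                 # extra time/energy spent estimating carefully
p_bitten_while_deliberating = 0.5   # a slower response lets a real snake strike

# Strategy 1: treat every ambiguous rustle as a snake and flee immediately.
# You pay a small cost 90% of the time, but fleeing avoids the bite.
ev_flee = -(1 - p_snake) * cost_flee

# Strategy 2: stop and form a closer-to-correct probability estimate first.
# You always pay the deliberation cost, and when it really is a snake,
# the delay sometimes gets you bitten.
ev_deliberate = -cost_deliberate - p_snake * p_bitten_while_deliberating * cost_bitten

print(f"always flee: {ev_flee:.2f}")        # -0.90
print(f"deliberate:  {ev_deliberate:.2f}")  # -7.00
```

With these (hypothetical) numbers, the fast, usually-wrong heuristic beats the slower, more accurate one, which is the point: the bias is cheap insurance against a rare but catastrophic outcome, not advocacy for anyone's interests.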
My understanding is that many failures of rationality are adaptive in this way: they trade off getting the right answer against energy and time costs. ("Right" here means the answer that will cause there to be the most offspring in future generations, not necessarily the answer that would count as "winning" by the lights of the human involved; evolution doesn't care a whit whether I feel like I've won or lost, or advanced what I see as my interests, as a result of something my brain is biased towards or away from.) One thing that could be different now is that the situations where we will starve are fewer, and the time we have to think before deciding what to do is often greater.
Motivated reasoning is a specific, relatively small subset of the biases our brains are subject to, not the main handle for biases in general, I think?