My own take on rationalization/motivated reasoning is that, at the end of the day, no form of ethics can meaningfully slow it down if the person either can't credibly commit their future selves or simply isn't bound by (and doesn't want to follow) ethical rules. So the motivated reasoning critique isn't EA-specific; rather, it shows two things:
People are more selfish than they believe themselves to be, and care less about virtues than they think, so motivated reasoning comes very easily.
We can't credibly commit our future selves to certain actions, especially over long timeframes, and even when people do care about virtues, motivated reasoning still harms their thinking.
Motivated reasoning, IMO, is a deep-seated problem within our own brains, and is probably unsolvable in the near term.