I like your write-up, very clear and accessible. You certainly have a gift for popularization, not just research. A rare combination.
I would just note upfront that
Reasoning well has little to do with what you’re reasoning towards.
and
Rationality of this kind is not about changing where you’re going, it’s about changing how far you can go.
are white lies, as you well know. It’s not unusual, in the process of reasoning about how best to achieve your goal, to find that the goal itself shifts or evaporates.
“How to best serve God” may result in deconversion.
“How to make my relationship with my partner a happy one” may result in discovering that they are a narcissistic little shit I should run away from. Or that both of us should find other partners.
“How to help my neighborhood out of poverty” might become “How to make the most money” in order to donate as much as possible.
This goal-evaporation danger is rarely as extreme, but it is ubiquitous. Every goalpost shifts when you optimize your shot hard enough. Your analogy
Your deepest desires are not a burden, but a compass
is very apt: following it strictly helps you reach your destination, but that does not mean the destination is actually there, or holds what you expected, or that getting there won’t kill you.
In this essay you talk about Instrumental Rationality as if it were separate from Epistemic. It is not. The dangers of good reasoning ought to be noted upfront, the way MoR!Harry did to Draco, only more so. Hopefully you already plan to talk about it in one of your remaining three essays.
Thanks!
You’ve caught me :-)
My stance on terminal values is “it’s possible to be wrong about what you deeply desire.” The person who deconverted through trying to figure out how to better serve God likely did so in the process of realizing they had deeper humanitarian values. Similarly with the person who tried to help their neighborhood out of poverty and became an EA.
This is in part why I said that reasoning well has “little” (instead of “nothing”) to do with what you’re reasoning towards. Similarly, in “it’s not about changing where you’re going,” I had no intention of equating “where you’re going” with “where you think you’re going” :-)
However, I agree that the default apparent connotation contains one doozy of a white lie.
Every goalpost shifts when you optimize your shot hard enough.
Optimize hard enough for “get the ball into the goal as fast as possible” and you explode the ball and drive its husk through the bodies of the defending team, and you don’t get asked to play football any more.