First, it seems to me that this is mainly a debate over the definition of instrumental rationality. And I suspect the reason people want to have this debate is so they can figure out whether they count as especially “instrumentally rational” or not.
The simplest definition of “instrumentally rational” I can think of is “a person is instrumentally rational to the extent they are good at acting to achieve their goals”. Under this simple definition, somebody with akrasia would not qualify as very instrumentally rational. Your definition amounts to drawing the boundary of agency differently, so that it doesn’t end at the person’s body but slices through their brain, between them and their akrasia. I don’t much like this definition because knowing the best thing to do (as opposed to doing it) seems like it should fall in the domain of epistemic rationality, not instrumental rationality.
I would prefer to draw the line at the person’s entire brain, so that someone who had a better intuitive understanding of probability theory might qualify as being more instrumentally rational, but an especially wealthy or strong person would not, even if those characteristics made them better at acting to achieve their goals.
The post actually seems to equivocate between epistemic and instrumental rationality—note the use of “rational beliefs” and “rational choices” in the same sentence.
I think it’s easy to defend a much weaker version of the thesis: that instrumental rationality maximizes expected utility, not the utility of results.
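The distinction can be made concrete with a toy sketch (the bet and numbers here are hypothetical, just for illustration): a choice can maximize expected utility ex ante and still come out worse than the alternative ex post, and the expected-utility criterion judges the decision rather than the outcome.

```python
def expected_utility(outcomes):
    """Expected utility of a list of (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

# Hypothetical gamble: a 50% chance at utility 3 versus utility 1 for certain.
take_bet = [(0.5, 3.0), (0.5, 0.0)]   # risky option, expected utility 1.5
keep_money = [(1.0, 1.0)]             # safe option, expected utility 1.0

# Taking the bet is the instrumentally rational choice by this criterion,
# even though half the time it yields less than the safe option.
best = max([take_bet, keep_money], key=expected_utility)
print(best is take_bet)  # True
```

On the weaker thesis, the agent who takes the bet and loses was still more instrumentally rational than one who declined it and got lucky.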
Here is a thought experiment that illustrates the slipperiness of instrumental rationality: suppose there is a world where everyone is respected according to their (Elo-rated) chess ability and nothing else. In this world your ability to make friends, earn a high salary, etc. all depend on how well you play chess. Should somebody who is better at playing chess be considered more instrumentally rational in this world?
My definition says yes, because chess playing is an ability that resides in the brain. If you define instrumental rationality as “ability to make choices with high expected value” or some such, that definition says yes as well because playing chess is a series of choices. You can imagine a hypothetical Flatland-weird universe where making good choices depends more on the kind of skills required to play chess and less on probabilistic reasoning, calculating expected values, etc. In this world the equivalent of Less Wrong discusses various chess openings and endgame problems in order to help members become more instrumentally rational.
Related thread on word usage: http://lesswrong.com/lw/96n/meta_rational_vs_optimized/