In my other message I said wealth doesn’t automatically make you more rational, because rationality is “systematic and agent-internal”. I don’t want to dismiss the problem you raised, though, because it gets us into deep waters pretty fast. So here’s a different response.
If I reliably use my money in a way that helps me achieve my ends, regardless of how much money I have, then giving me more money can make me more instrumentally rational, in the sense that I consistently win more often. Certainly it’s beyond dispute that being in such a situation has instrumental value, bracketing ‘rationality’. The reason we don’t normally think of this as an increase in instrumental rationality is that when we’re evaluating your likelihood of winning, the contributing factors we call ‘instrumental rationality’ are the set of win-relevant cognitive algorithms. Having money isn’t a cognitive algorithm, so it doesn’t qualify.
Why isn’t having money a cognitive algorithm? ‘Because it’s not in your skull’ isn’t a very satisfying answer. It’s not necessary: A species that exchanges wealth by exchanging memorized passcodes might make no use of objects outside of vocal utterances and memes. And it’s probably not sufficient: If I start making better economic decisions by relying more heavily on a calculator, it’s plausible that part of my increased instrumental rationality is distributed outside my skull, since part of it depends on the proper functioning of the calculator. Future inventions may do a lot more to blur the lines between cognitive enhancements inside and outside my brain.
So the more relevant point may be that receiving a payment is an isolated event, not a repeatable process. If you found a way to receive a steady paycheck, and reliably used that paycheck to get what you wanted more often, then I’d have a much harder time saying that you (= the you-money system) haven’t improved the instrumental rationality of your cognitive algorithms. It would be like trying to argue that your gene-activated biochemistry is agent-internal, but the aspects of your biochemistry that depend on your daily nootropic cocktail are agent-external. I despair of drawing clear lines on the issue.
Money isn’t a cognitive algorithm because it doesn’t actually help you decide what to do. You don’t generally use your money to make decisions. Having more money does put you in a better position, one where the available options are more favourable, but that’s not the same as improving how you decide.
Of course, if you spend that money on nootropics (or a calculator, I suppose), you might be said to have used money to improve your instrumental rationality!
Because it isn’t an algorithm—a step-by-step procedure for calculations. (Source: Wikipedia.)