Another type of “fear of anti-akrasia techniques” that sometimes occurs is fear or mistrust of what one’s own conscious decision-making process might goof up, if that process is abruptly given increased power. (This differs from Nominull’s description, because in this scenario you don’t specifically fear surfing the internet less, or any other specific foreseen change; you fear the effects of suddenly removing a system of internal checks and balances, and handing your internal reins over to a new and untested cognitive subsystem.)
Even if your consciously claimed preferences are your “real” preferences (which is not at all obvious, given that claimed preferences may be chosen for the purpose of affecting your self-image or your external social image, rather than for the purpose of choosing between future outcomes)...
… even in this case, there’s the additional problem that your consciously claimed “beliefs” may not be your actual anticipations, and, even if they are your actual anticipations, may be a worse model of the world than the model implicit in our cultural action-patterns. A person who “believes” her actions will determine whether she spends eternity in heaven or hell, but who has ordinary levels of akrasia and mostly just does what the people around her are doing, is less harmed by her beliefs than she would be if she could actually take the actions that her stated beliefs and preferences imply. Ditto for a person who believes a strange and unhealthy diet would be beneficial (but can’t seem to fully stick to the new diet), or who believes overconfidently that a particular peak oil scenario is “99% likely” (but takes some of his actions in a more ordinary manner anyhow).
If people were easily able to act on their consciously claimed beliefs and preferences, present levels of irrational beliefs might lead to considerably more disruption than they do. Conversely, if people had more reason to trust their “beliefs” and “preferences”, I wouldn’t be at all surprised if akrasia decreased.
This connection is one reason that learning to actually form reasonable beliefs (epistemic rationality), and learning to act in a manner that actually makes sense given one’s beliefs and preferences (overcoming “akrasia”), strike me as linked aspects of a single art.
Even in people whose conscious world models are basically sane (by our exacting LW standards), when they’re considering doing or planning for some weird, uncomfortable, and out-of-the-ordinary action seemingly justified by weighing costs and benefits, it seems to me that akrasia can sometimes be a rational stand-in for considerations they don’t keep conscious track of, including but not limited to:

- reputation costs;
- willpower loss / ego depletion, and other limits on worry;
- the inference that if one weird thing seems especially important now, others will seem especially important in the future;
- the possibility that one might waste resources by not following through on a weird plan;
- the desirability of keeping one’s mind “cleaner” by keeping fewer chunks in one’s planning space;
- the benefits of long-term happiness, and of not associating unhappiness with rationality, to oneself or to other people;
- the benefits of having one’s actions and motivations make sense to other people;
- various self-image issues.

There are going to be many unmodeled considerations in the other direction too, but I suspect they will normally be fewer.
It’s better to forgo a 1-expected-util hare-brained scheme than let it distract you into a 20% chance of forgoing a 10-expected-util hare-brained scheme, or so Morgenstern tells me.
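The arithmetic behind that comparison can be made explicit. A minimal sketch, using the numbers from the text (the variable names are mine, for illustration only):

```python
# Expected-utility comparison from the text above.
small_scheme_utils = 1    # expected utils of the minor hare-brained scheme
big_scheme_utils = 10     # expected utils of the major hare-brained scheme
p_distraction = 0.2       # chance the minor scheme distracts you from the major one

# Expected cost of pursuing the small scheme: the utils you stand to lose
# if the distraction makes you forgo the big scheme.
expected_loss = p_distraction * big_scheme_utils  # 0.2 * 10 = 2.0

# Forgoing the 1-util scheme is the better move whenever the expected
# loss from distraction exceeds the small scheme's own value.
print(expected_loss > small_scheme_utils)  # True: 2.0 > 1
```

So even a genuinely positive-expected-value scheme can be worth dropping when its opportunity cost, weighted by the probability of distraction, is larger.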
On the other hand, I don’t want to hand people tools for mediocrity here; many times akrasia in these situations really is just irrational. I wish I had a better idea of when.
It strikes me that this fear of one’s own conscious decision-making is similar to theists’ fear of taking God away and morality then being undetermined: they don’t trust themselves, or the rest of us. (But we knew that.) It’s interesting to consider it as a more general case of fear of thought about thought. The roots of anti-rationality get everywhere.