Instrumental rationality is the practical application of epistemic rationality, “winning” being the criterion of whether you did it right. Is there anything that can be said about instrumental rationality at that level, rather than by exhibiting exemplary particular cases?
That part of the wiki page was written in this edit.
Thanks. Fixed. If someone wants to include discussion about how instrumental and epistemic rationality are related they may consider creating an additional subheading for that purpose. The ‘instrumental rationality’ section needs to be a simple definition of what the phrase refers to.
I don’t think I agree with the wiki. Epistemic rationality is not a special case of instrumental rationality, like a car is a special case of a vehicle; it is an essential component, like an engine is of a car.
Expressing that by saying that epistemic rationality is instrumentally rational for instrumental rationality is cute, but doesn’t justify the wiki’s statement. “In a sense”, it says, but not an important one.
If learning a piece of knowledge will hurt you (emotionally, or be bad for your mental health) then it might be bad, instrumentally, to learn it. Personally, I value the truth because it is a massive prerequisite for doing good in the world (although I do tend to value it a bit more intrinsically). But if epistemic rationality didn’t help me be instrumentally rational, then I wouldn’t value it half as much. I want to win.
If learning a piece of knowledge will hurt you (emotionally, or be bad for your mental health) then it might be bad, instrumentally, to learn it.
Better, instrumentally, to learn to handle the truth. Ignorance and dullness are not qualities to be cultivated, however fortuitously useful on occasion it might be to not know something, or be unable to notice an implication of what you do know.
But if epistemic rationality didn’t help me be instrumentally rational
If it doesn’t, you’re doing it wrong. This is the entire point of LessWrong.
Better, instrumentally, to learn to handle the truth.
It really depends on your goals/goal system. I think the wiki definition is supposed to encompass possible non-human minds that may have some uncommon goals/drives, like a wireheaded clippy that produces virtual paperclips and doesn’t care whether they are in the real or virtual world, so it doesn’t want/need to distinguish between them.
It really depends on your goals/goal system. I think the wiki definition is supposed to encompass possible non-human minds that may have some uncommon goals/drives, like a wireheaded clippy
I really do not care about hypothetical entities that have the goal of being ignorant, especially constructions like wireheaded clippies. It’s generally agreed here that wireheading is a failure mode. So is the valorisation of ignorance by romanticism.
It obviously is. If it is part of your utility function to value truth, then seeking truths will be instrumentally rational. ← This is the special case.
Epistemic rationality is just one of many means available for winning. There will be trade-offs between these means, first in terms of opportunity costs, but second where employing the means is a handicap in a situation. Under a totalitarian theocracy, it’s likely instrumentally rational to believe in the Great Gazoo with the rest of them. In general, there are costs to not aligning your beliefs with the society you live in, even when those beliefs are false. Whether the benefit of more accurate prediction from epistemic rationality outweighs those costs is a factual matter that depends on the situation.
Epistemic rationality is just one of many means available for winning. There will be trade-offs between these means
I am not interested in these tradeoffs, or in whatever imagined situations where it would hypothetically be better to be dull and ignorant. If LessWrong is about anything, it is about epistemic rationality and its employment for instrumental rationality.
I am not interested in these tradeoffs, or in whatever imagined situations where it would hypothetically be better to be dull and ignorant.
I’m interested. I’m even interested in what aspect of instrumental rationality is served by renouncing epistemic rationality on the question of the limits of epistemic rationality to serve instrumental ends.
Interestingly, the wiki turns it around and says that epistemic rationality is a special case of instrumental rationality.
I wonder who wrote the wiki page. The claim is controversial. I’d say that the article would be better without it.
Sorry, this whole subthread has devolved into idiotic word games.