I don’t think I agree with the wiki. Epistemic rationality is not a special case of instrumental rationality, like a car is a special case of a vehicle; it is an essential component, like an engine is of a car.
Expressing that by saying that epistemic rationality is instrumentally rational for instrumental rationality is cute, but it doesn’t justify the wiki’s statement. “In a sense”, it says, but not an important one.
If learning a piece of knowledge will hurt you (emotionally, or by being bad for your mental health), then it might be bad, instrumentally, to learn it. Personally, I value the truth because it is a massive prerequisite for doing good in the world (although I do value it a bit intrinsically as well). But if epistemic rationality didn’t help me be instrumentally rational, then I wouldn’t value it half as much. I want to win.
If learning a piece of knowledge will hurt you (emotionally, or by being bad for your mental health), then it might be bad, instrumentally, to learn it.
Better, instrumentally, to learn to handle the truth. Ignorance and dullness are not qualities to be cultivated, however fortuitously useful it might be, on occasion, not to know something or to be unable to notice an implication of what you do know.
But if epistemic rationality didn’t help me be instrumentally rational
If it doesn’t, you’re doing it wrong. This is the entire point of LessWrong.
Better, instrumentally, to learn to handle the truth.
It really depends on your goals/goal system. I think the wiki definition is supposed to encompass possible non-human minds that may have some uncommon goals/drives, like a wireheaded clippy that produces virtual paperclips and doesn’t care whether they are in the real or virtual world, so it doesn’t want/need to distinguish between them.
It really depends on your goals/goal system. I think the wiki definition is supposed to encompass possible non-human minds that may have some uncommon goals/drives, like a wireheaded clippy
I really do not care about hypothetical entities that have the goal of being ignorant, especially constructions like wireheaded clippies. It’s generally agreed here that wireheading is a failure mode. So is the valorisation of ignorance by romanticism.
It obviously is. If it is part of your utility function to value truth, then seeking truths will be instrumentally rational. ← This is the special case.
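To make the “special case” reading concrete, here is a minimal toy sketch (every name and number in it is invented for illustration; none of it comes from the wiki or anyone’s comment): if an agent’s utility function assigns positive weight to holding true beliefs, then truth-seeking falls out of ordinary utility maximisation, with no separate notion of epistemic rationality needed.

```python
# Toy illustration only: an agent whose utility function includes a term for
# holding true beliefs. With zero weight on truth, staying ignorant wins;
# with positive weight, learning the facts becomes the instrumentally
# rational choice. All names and numbers are invented for the example.

def utility(paperclips_made: float, true_beliefs_held: int,
            truth_weight: float) -> float:
    """Total utility = external payoff + the weight placed on true beliefs."""
    return paperclips_made + truth_weight * true_beliefs_held

# Two candidate actions and their (made-up) outcomes.
actions = {
    "stay_ignorant":   {"paperclips_made": 5.0, "true_beliefs_held": 0},
    "learn_the_facts": {"paperclips_made": 4.0, "true_beliefs_held": 3},
}

def best_action(truth_weight: float) -> str:
    """Return the action that maximises utility for a given truth weight."""
    return max(actions,
               key=lambda name: utility(**actions[name],
                                        truth_weight=truth_weight))

print(best_action(truth_weight=0.0))  # prints "stay_ignorant"
print(best_action(truth_weight=1.0))  # prints "learn_the_facts"
```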
Sorry, this whole subthread has devolved into idiotic word games.