Seems fairly uncontroversial to me, but that’s likely because it stays far-mode. If you get specific and near-mode, I suspect you’ll stir up some disagreement. Leave aside which beliefs you’d rather other people have or not have; that’s a separate dark-arts topic. For your own ability to achieve your goals, which true beliefs are you better off not having?
I completely agree that I have limited resources and need to prioritize which beliefs are worth spending them on. I’m far less convinced that true beliefs (in the paying-rent sense: those that assign correct conditional probabilities to the outcomes of your potential actions) ever have negative value.
Nick Bostrom suggests some examples in his “Information Hazards” paper.
That wasn’t nearly as exciting as it sounded.