obtaining the truth is the rationalists’ ultimate goal
Nope. It’s an instrumental goal. We just believe it to be very useful, because in nontrivial situations it is difficult to find a strategy to achieve X without having true beliefs about X.
Are there scenarios in which a rationalist should actively avoid the truth in order to maximize their expected utility?
Omega tells you: “Unless you start believing in horoscopes, I will torture all humans to death.” (Or, if making oneself believe something false is too difficult, then something like: “There is one false statement in your math textbook, and if you even find out which one it is, I will torture all humans to death.” In which case I would avoid looking at the textbook ever again.)
Option 2 is that Omega will answer truthfully one question on absolutely any subject pertaining to our universe, with no strings attached. You can ask about the laws governing the universe, the meaning of life, the origin of time and space, whatever, and Omega will give you an absolutely truthful, knowledgeable answer.
I guess it would depend on how much I would trust myself to ask a question that could bring me even more benefit than option 1. For example: “What is the most likely way that I could become Omega-powerful without losing my values? (Most likely = relative to my current situation and abilities.)” Because a lucky answer to this one could be even better than the first option. -- So it comes down to an estimate of whether such a lucky answer exists, what my probability is of following the strategy successfully if I get the answer, and what my probability is of asking the question correctly. Which I admit I don’t know.
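The estimate in that comment has a simple expected-utility structure: the question beats the sure offer only if the product of the three uncertainties, times the size of the payoff, exceeds option 1's value. A minimal sketch, where all the numbers are purely hypothetical placeholders (the comment explicitly says it doesn't know them):

```python
# Hypothetical expected-utility comparison for the two Omega offers.
# All probabilities and payoffs below are illustrative placeholders,
# not values claimed anywhere in the thread.

def expected_value_option2(p_answer_exists: float,
                           p_ask_correctly: float,
                           p_follow_through: float,
                           payoff: float) -> float:
    """Expected payoff of asking the one question: all three
    uncertainties from the comment must come out right."""
    return p_answer_exists * p_ask_correctly * p_follow_through * payoff

value_option1 = 1.0  # normalize the sure offer's value to 1

ev2 = expected_value_option2(
    p_answer_exists=0.5,   # does a "lucky" answer even exist?
    p_ask_correctly=0.4,   # can I phrase the question correctly?
    p_follow_through=0.3,  # can I execute the strategy once told?
    payoff=10.0,           # how much better a lucky answer would be
)

print(ev2)                  # 0.6
print(ev2 > value_option1)  # False: under these numbers, take option 1
```

The point of the sketch is only the decision rule, not the numbers: because the three probabilities multiply, even a large payoff is quickly discounted when each step is uncertain.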
Where truth is a terminal goal, it is a terminal goal. The fact that it is often useful as a means to some other goal does not contradict that. Cf: valuing money for itself, or for what you can do with it.
Your statements are perfectly correct. You’re probably being downvoted because people are assuming that you’re talking about your own values, and they don’t believe that you could “really” hold truth as your sole terminal goal.