“Maximizing truth” doesn’t make any sense. You can’t maximize truth. You can improve your knowledge of the truth, but the truth itself is independent of your brain state.
In any case, when is untruth more instrumental to your utility function than truth?
Having accurate beliefs is an incredibly useful thing. You may well find it serves your utility better.
I think it’s fairly obvious that “maximizing truth” meant “maximizing the correlation between my beliefs and truth”.
Truth is overrated. My prior was heavily biased toward truth, but then a brief and unpleasant encounter with nihilism caused me to lower my estimate.
And before you complain that this doesn’t make any sense either, let me spell out that this is an estimate of the probability that the strategy “pursue truth first, happiness second” yields, on average, more hedons than “pursue happiness using the current set of beliefs”.