I reject the framing of truth vs winning. Instead, I propose that only winning matters. Truth has no terminal value.
It does, however, have enormous instrumental value. For this reason, I support the norm always to tell the truth even if it appears as if the consequences are net negative – with the reasoning that they probably aren’t, at least not in expectation. This is so in part because truth feels extremely important to many of us, which means that having such a norm in place is highly beneficial.
The other response is much more interesting, arguing that appeals to consequences are generally bad, and that meta-level considerations mean we should generally speak the truth even if the immediate consequences are bad. I find this really interesting because it is ultimately about infohazards: those rare cases where there is a conflict between epistemic and instrumental rationality. Typically, we believe that having more truth (via epistemic rationality) is a positive trait that allows you to “win” more (thus aligning with instrumental rationality). But when more truth becomes harmful, which do we prioritise: truth, or winning?
The keyword here is “immediate” [emphasis added], which you drop by the end. I agree with the first part of this paragraph but disagree with the final sentence. Instead, my question would have been, “but when more truth appears to become harmful, how do we balance the immediate consequences against the long term/fuzzy/uncertain but potentially enormous consequences of violating the truth norm?”
I read jimrandomh’s comment as reasoning from this framework (rather than arguing that we should assign truth terminal value), but this might be confirmation bias.
I also support the general norm of defaulting to truth. But I do believe there are cases where the negative consequences of truth become so severe and immediate that it is reasonable to forgo it in favour of winning. The bar for that should be very high, but not unreachable.