Winning vs Truth – Infohazard Trade-Offs

[Cross-posted from Grand Unified Crazy.]

This post on the credibility of the CDC has sparked a great deal of discussion on the ethics of posts like it. Some people claim that the post itself is harmful, arguing that anything which reduces trust in the CDC will likely kill people as they ignore or reject important advice for dealing with SARS-CoV-2 and (in the long run) other issues like vaccination. This argument has been met with two very different responses.

One response has been to argue that the CDC’s advice is so bad that reducing trust in it will actually have a net positive effect in the long run. This is ultimately an empirical question which somebody should probably address, but I do not have the skills or interest to attempt that.

The other response is much more interesting, arguing that appeals to consequences are generally bad, and that meta-level considerations mean we should generally speak the truth even if the immediate consequences are bad. I find this really interesting because it is ultimately about infohazards: those rare cases where there is a conflict between epistemic and instrumental rationality. Typically, we believe that having more truth (via epistemic rationality) is a positive trait that allows you to “win” more (thus aligning with instrumental rationality). But when more truth becomes harmful, which do we prefer: truth, or winning?

Some people will simply decide to value truth more than winning as an axiom of their value system. But for most of us, I think this too ultimately boils down to an empirical question of just how bad “not winning” will end up being. It’s easy to see that for sufficiently severe cases, natural selection takes over: any meme/person/thing that prefers truth over winning in those cases will die out, to be replaced by memes/people/things that choose to win. I personally will prefer winning in those cases. It’s also true that most of the time, truth actually helps you win in the long run. We should probably reject untrue claims even if they provide a small amount of extra short-term winning, since in the long run holding an untrue belief is likely to prevent us from winning in ways we can’t predict.

Figuring out where the cut-over point lies between truth and winning seems non-trivial. Based on my examples above we can derive two simple heuristics to start off:

  • Prefer truth over winning by default.

  • Prefer winning over truth if the cost of not winning is the destruction of yourself or your community. (It’s interesting to note that this heuristic arguably already applies to SARS-CoV-2, at least for some people in at-risk demographics.)

What other heuristics do other people use for this question? How do they come out on the CDC post and SARS-CoV-2?