I would draw a distinction within epistemic rationality between (A) wanting to know more true things, and (B) wanting to avoid believing false things.
Most of the examples given poke at a tension between learning, or potentially learning, or disseminating for others to learn, something new (so, type A epistemics according to the typology I just pulled from my nethers) versus the risks of instrumental harm incurred in the process.
Only example 3 really addresses type B by asking whether it’s better for Cansa’s friend to believe a falsehood if that will improve their prognosis. But the distinction between believing 51% and 49% is small enough for that to feel like a small and weak falsehood anyway.
I don’t know what your poll respondents had in mind, but if you asked me about my preference between epistemic and instrumental rationality, truth vs. winning, my first thought would be a rather stronger type B example; I’d be asking myself whether I’d want to believe a fairly large falsehood in exchange for instrumental benefits. That might sidestep part of the intent of the poll question, but it might also explain some of the apparent discrepancy.
Yeah, but type B and its many forms (placebo, nocebo, psychosis, etc.) are already widely known and documented. I only included it begrudgingly for the sake of completeness, and because someone was going to mention it in the comments (not sure why I made the effect size so small). This post is not titled ‘steelman of the pragmatist position’; otherwise I would indeed have focused more on type B, for which there are more real-world examples. I wanted to think up some fun and strange thought experiments that might push people’s brains in directions they don’t usually go and get them to consider new angles.