I think this is impossible to model very well without acknowledging the difference between private beliefs and public statements. Noticing hypocrisy in others does not require pointing it out (though it’s often a kindness, and sometimes a profitable move, to do so). Noticing hypocrisy in oneself (difference between internal beliefs and actions) can be addressed quietly rather than in public. Intentional hypocrisy in oneself (saying things that differ from your true beliefs) can be rational as well.
Generally speaking, habitually lying to others takes more effort and has pretty common epistemic failure modes. For tactical reasons, it’s nice to strive for honesty and public truth. But it’s not actually required for rationality.
There’s also a question of context and open but non-published truth. There are plenty of taboo truths (well, not Truth exactly, more like taboo useful models) that I’m willing to explore among small groups where nuance and acknowledgement of uncertainty are common, but very reluctant to get into in any public sphere where I don’t know and trust everyone involved.
Indeed, it is instrumentally useful for instrumental rationalists to portray themselves as epistemic rationalists. And so this is a common pattern in human politics—“[insert political coalition] care only about themselves, while [insert political coalition] are merely trying to spread truth” is one of the great political cliches for a reason. And because believing one’s own lies can be instrumentally useful, falsely believing oneself to have a holy devotion to the truth is a not-uncommon delusion.
I try to disabuse myself of this delusion.
There’s a subtle paradox here. Can you spot it?
He is trying to disabuse himself of the premise [X] that he is committed to the truth over socially useful falsehoods. But premise [X] is itself socially useful to believe, and he claims it is false; disbelieving it would therefore show that he does sometimes value the truth over socially useful falsehoods, contradicting the point.
More specifically, there are three possibilities here:
1. X is broadly true. He’s just wrong about X, but his statement that X is false is not socially motivated.
2. X is usually false, but his statements about X are a special case for some reason.
3. X is false, but his statement that X is false doesn’t contradict this, because denying X is actually the socially useful thing rather than affirming X. LessWrong might be the kind of place where denying X (saying that you are committed to spreading socially useful falsehoods over the truth) actually gets you social credit: readers interpret affirming X as the usual credit-seeking move, so denying it reads as a signal that you are committed to saying the taboo truth (not-X) over what is socially useful (X), the exact opposite of what was stated. If true, this would be quite ironic. This interpretation is self-refuting in multiple ways, both logically (for not-X to be a “taboo truth”, X has to be false, which already rules out the conclusion of this line of reasoning) and causally (if everyone uses this logic, the premise that affirming X is socially useful becomes false, because denying X becomes the socially useful thing). But that doesn’t mean readers couldn’t actually be drawing this conclusion without noticing the problems.
What if you feel the need to proclaim your taboo truth, but you aren’t quite sure what it is yet?
Then you will find yourself drawn to those situations in which you can proclaim basically any truth without fear of undue repercussion, and once you are there you will find yourself pontificating.
Is this a good position to be in? Knowing what the taboo truth is in more detail might force you to proclaim it earlier, or else give up on it, which is presumably why its detail has been occluded from you.
(incidentally, I at first read the title as “apply Rationalist Taboo to the word ‘truth’ ” which could also be an interesting exercise)
It is worth asking yourself why you are focused on this particular taboo truth. Why the Emperor’s clothes in particular? Why not the Pope’s supposed relationship with god? Why not the Cobbler’s guild’s propaganda on the safety hazards of unauthorized shoe repair? Is it possible you are just latching on to the first “Big Lie” you have encountered since you read a bunch about the virtue of truth?
I’ve talked loudly about lots of taboo truths...
Where? I’ve had a post draft about it forever, but figured it would just be knee-jerk downvoted as non-compliant with the prevailing attitudes of epistemic rationality as a search for truth.
It’s often hard to get a good handle on a proposition if you don’t feel able to talk about it with people who disagree. I’ve offered in the past to privately go over any potentially dangerous ideas anyone thinks they’ve found.
Lies which coincide with the enforcement of taboos, or lies which misrepresent the character of people thereby destroying ideal speech situations, are never noble lies. This is not a hard case to make.
I agree with the sentiment here, but I’m not sure the factual content is really significant, since it’s essentially an argument from the definition of “noble lie”.
Thinking about it, it seems that if a person wants to point out a taboo truth without being exposed to the potential social/political repercussions, a safer way to do so would be to privately share it with someone who is unaware of those repercussions, and encourage them to point it out instead.
Are you suggesting just letting someone oblivious take the fall, or am I misunderstanding?
You could have far more latitude speaking up and making your case on a sensitive topic someone else raised, even if you privately gave them the idea. Taboos can only really be enforced so long as common knowledge exists that this-is-a-taboo-we’re-enforcing; it doesn’t take much to destroy common knowledge about any social assumption.
This is true, but if the goal is to minimize risk, it’s hard to do. It depends in part on the size, coordination, power, and zeal of those who proclaim the falsehood.
The Catholic Church didn’t bother prohibiting Copernicus’ ideas until 70+ years after his death, when the Reformation was underway, the Church was less obviously dominant and could less easily tolerate internal dissent, and Kepler (building on Brahe’s observations) had put heliocentrism on a sounder empirical and theoretical footing and associated it with the Protestants. That’s why (in addition to annoying and insulting those in power) Galileo was put under house arrest for speaking up, even though he technically published his Dialogue with permission from the Inquisition.
What size coalition do you need to quietly assemble, without being caught or anyone breaking the silence, and with what kind of structure and voice and power, in order to speak up relatively safely when voicing a taboo truth? I obviously don’t expect anything like a closed-form analytic answer, but unless there’s some easily-describable-and-communicable heuristic (with plausible deniability, since this is the kind of thing that can come to be looked down on due to association with taboo truths), we’re right in the realm of expecting scientists to have, on average, an unreasonable level of political savvy.
I think a key point here is that any of this only helps if people care more about the truth of the issue at hand than about whatever realpolitik considerations the issue has tangentially gotten pulled into. And yeah, absent “unreasonable levels of political savvy”, academics mostly rely on academic issues usually being far enough from the icky world of politics to be openly discussed, at least outside of a few seriously diseased disciplines where the rot is well and truly set in. The powers that be seem to care about the truth of an issue only when it starts directly impinging on their day-to-day; people seem to find it noteworthy when this isn’t true of a given leader.
I don’t think this will ever be fully predictable. E.g. in the US I don’t think anyone really saw the magnitude of the backlash against election workers, academics, and security folks coming until it became headline news. And arguably that’s what a near-miss looks like.
Yes, “taking the risk” was what I had more in mind, but essentially so.
In addition to the risk that you’ll feel bad about yourself for causing someone else to suffer for your truth, there’s a significant risk that they’ll do a much worse job than you, and make it easier for the truth to be denied.
Good point! If one wants to privately discuss a taboo truth, should one equally emphasize both the “taboo” and the “truth” of the matter? On first thought, ethically I would say yes.