I find that there is often a conflict between a motivation to speak only the truth and a motivation to successfully communicate as close approximations to the most relevant truths as constraints of time, intelligence and cultural conversational conventions allow.
Eli, you can get away with wearing whatever buttons you want because you can back up all your claims by pointing skeptical people to your writings, works, etc.
But say I am a rationalist and I prefer honesty over dishonesty. And say I am trying to win someone’s confidence/trust/love/respect, and I believe in the ‘rationalists should win’ principle. And this person doesn’t necessarily care about rationality or honesty other than to the extent that was passed on to her by the prevailing social norms in her community/church, etc. Moreover, she doesn’t see me as some ultra-rationalist guy, and there is nothing I can do to prove otherwise, short of saying, hey, check out all the websites I browse every day, or hey, see all the books I have.
Now, when I talk to her, I twist the truth (or lie outright) to make sure I send a friendly signal to get what I want.
Even if I am talking to a person I’ve known for years, I’ll probably still calibrate my words to send a message that I know will be received the way I want it to be received, to eventually get what I want.
My gut feeling is that this way of thinking is surely not right, but why? It surely is the ‘less wrong’ way in some situations. So, does it just boil down to personal preference about where the line should be drawn? I think so.
I think the issue is with the “get what I want” part. Isn’t this treating people as a means to an end, instead of treating them as ends in themselves?* (I think that Kant would not be happy, though I don’t know of anything that has been written on lesswrong about this.)
If you are talking to another person and you are trying to convince them to adopt a certain view of you, that is not what I would call truth-oriented. So whether you specifically lie, omit, or whatever is already secondary. If your goal is to have an honest interaction with another being, I don’t think you can, in that interaction, want to edit their perception of you (apart from correcting misunderstandings, etc.).
I’d say that the way you achieve your goal is to become what you want to be seen as. This is, of course, harder than just lying, but in a way it takes less effort, too.
Plus, you avoid another important pitfall I could see here: lying to yourself about wanting a connection with a person who doesn’t share your values. If you have to lie to fit in with them, maybe not fitting in with them is a good thing, and you should pay attention to that. In this way, the impulse to lie may be as useful as the tiny voice telling you that you are confused.
(The following is just about the effort it takes to lie vs. to tell the truth. Not really required for the core idea; read if you wish ^^)
Imagine what insane effort it would take to lie all the time yet try to be perceived as honest! While “just” being honest is hard in its own way, on subtler and subtler levels, I at least was freed of a lot of the mental overhead that lying brings with it. (Sure, part of that was replaced by the mental habits of self-checking, but still, way less. I don’t have to worry about what I may have said at some point if I don’t remember it. I can see what I would say now, and unless I have acquired new information or insight, this will probably approximate what I said then. If I am also honest about this process, my self-perceived fault of imperfect memory isn’t too bad anymore. This can never work with lying, because you need to keep tabs on what you told whom, how they may have gained additional information, etc.)
*(The fact that you specified the gender of the other person also implies a certain degree of “means to an end” to me (yes, even without knowing your gender) unless you are talking about one specific situation and nothing else. But that may just as well be wrong.)
I think the problem is that lying to other people will tend to reinforce lying to yourself and to others. Your brain can’t just separate the circumstances like that. Rationalists win when their hardware lets them—and our hardware is very limited.
Definitely. There is a significant risk of failing to communicate accurately if we decide that honesty is all we are obligated to do. This seems inconsistent with the ideal that rationalists should win: in this case, winning over the difficulties of accurate communication, rather than simply trying virtuously.
More broadly, though, there is an ambiguity about what exactly honesty really means. After all, as Douglas Adams points out, speaking the truth is not literally what people mean when they tell each other to be honest; for one thing, this is neither a sane nor a terminating request. I suspect this is one of those cases where the graceful failure of the concept is socially very useful, and so the ideal is useful, but overachievement is not necessarily any better than underachievement (at least not in societal terms).
I wouldn’t say “trying virtuously,” though maybe that’s right. I definitely wouldn’t talk about a “motivation to speak only the truth.” It seems so rigid that I would call it a ritual or a superstition, a sense that there is only one correct thing that can be said.
Perhaps the problem is that the (unconscious) goal is not to communicate, but to show off to third parties, or even to make the listener feel stupid?
A good working definition might be “attempting to communicate in a way that makes the recipient’s map match the territory as closely as is reasonable.”
That’s teaching for you: the raw truth of the world can be difficult to understand either in the context of what you already ‘know’ (Religion → Evolution) or in its own right (Quantum physics).
This reminds me of the “Lies to Humans” of Hex, the thinking machine of Discworld, where Hex tells the Wizards the ‘truth’ of something, couched in things they understand, basically to shut them up rather than to actually tell them what is really happening.
In general, a person cannot jump from any preconceived notion of how something is to the (possibly subjective!) truth. Instead, to teach, you tell lesser and lesser lies, which, in the best case, may simply be more and more accurate approximations of the truth. Throughout, you, the teacher, have been as honest to the learner as you can be.
But when someone’s notion of something is wrong enough, I can see how these steps could, in themselves, contain falsehoods that are not approximations of the truth. Is this honest? To teach a flat-Earther that the world is round, perhaps a step is to consider the world being convex, so as to explain why ships disappear over the horizon.
If your goal is to get someone’s understanding closer to the truth, it may be rational, but the steps you take, the things you teach, might not be honest.
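The ship example can even be made quantitative: each successively less-false model of the Earth makes a testable prediction about how much of a distant ship is hidden below the horizon, which is what makes the intermediate step convincing to the learner. A minimal sketch (the function name and the distances are my own illustration, not from the thread; it uses the standard small-angle approximation h ≈ d²/2R):

```python
R_EARTH_KM = 6371.0  # mean radius under the "round Earth" model

def hidden_height_m(distance_km, radius_km=R_EARTH_KM):
    """Height (in metres) of a ship's hull hidden below the horizon
    for an observer at sea level, under a spherical-Earth model.
    A flat-Earth model predicts 0 at every distance."""
    # small-angle approximation: h ≈ d^2 / (2R), converted km -> m
    return (distance_km ** 2) / (2 * radius_km) * 1000

# The flat model says a ship 20 km away is fully visible; the
# spherical model predicts its lower ~31 m are hidden.
print(round(hidden_height_m(20), 1))  # prints 31.4
```

The same function, fed a slightly smaller radius for an oblate-spheroid cross-section, gives slightly different numbers, which is the sense in which each model in the ladder is a closer approximation rather than simply true or false.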
Only nitpicking, and I do like the example, but ‘the world is convex’ is actually less false than ‘the world is round’.