I think about this topic a lot, and I appreciate your dissent, particularly since it helped me organize my thoughts a little. That said, I think you’re almost completely wrong. The best way to get at the problem is probably to start with your examples. Not exactly in order, sorry.
The lesson here is likewise clear: If your actual personality isn’t good enough, pretend to be Flynn Rider to everyone at all times, with the sole carve-out being people who love you, like your mother or a princess. This works because people who love you will find your openness endearing, whereas everyone else will think you pathetic and use it against you.
Here’s a true story. I once met a lovely and intelligent woman who didn’t like that I’m a bit blunt and ruthlessly truthseeking. I didn’t stop being that way, and mainly for that reason we didn’t become romantically involved. A few months later I met a lovely, intelligent, reasonable, sometimes blunt, and open-minded woman who did like that I’m a bit blunt and ruthlessly truthseeking. We’ve been dating for 2.5 years now and I’m on balance very happy with how everything worked out.
A Paragon of Morality is out travelling when he is beset by bandits. They demand he hand over his gold or they will kill him and take it from his corpse. This is not a decision-theoretic threat because the bandits value getting his gold more than they disprefer committing murder, but would otherwise avoid the murder if possible. If he hands over all his gold he will lose all his gold. If he hands over all the gold in his pockets, neglects the extra he has hidden in his sock, and says “I have given you all my gold” in a sufficiently convincing tone of voice, then he will lose less than all his gold.
This isn’t Omega we’re dealing with here; the bandits are totally trickable by a moderately convincing performance. If he keeps some of the gold he can donate it to GiveWell-approved charities and save however many QALYs or whatever.
Does he have a moral obligation to lie?
He certainly doesn’t have a moral obligation to tell the truth. But a lot of moral obligations change when someone points a gun at you. For instance, it becomes morally permissible (though not necessarily feasible) to shoot at them, or to give up what money you must and later steal it back at the first available opportunity. To me, the truth is something precious, and lying is like stealing the truth; it’s permissible in some extreme and usually adversarial situations. With that said, I’m a bit of a rationalist dedicate/monk and I’d rather fight than lie—however I don’t think everyone is rationally or otherwise compelled to follow suit, for reasons that will be further explained.
A Normally Honest Man is applying for a job as a Widget Designer. He has many years of industry experience in Widget Engineering. He has memorised the Widget Manufacturing Process. He’s actually kind of obsessed with Widgets. Typically whenever a conversation becomes about Widgets he gushes openly and makes a bad impression with his in-laws. Since that incident he has developed the self-control to pretend otherwise, and the rest of his personality is okay.
The interviewer works for a Widget Manufacturing company but seems to only care about Widgets a normal amount. He asks “How interested are you in Widgets?” The applicant has learnt from previous job interviews that, if he answers honestly, the interviewer will think he is lying, insane, or too weird to deal with, and will not hire him, even though this is not in the best financial interests of the company, were they fully informed.
Should he pretend to like widgets the amount most likely to get him hired, or does he have a moral obligation to keep answering honestly until he runs out of rent money and becomes homeless?
I don’t know, he could say “Honestly, I enjoy designing widgets so much that others sometimes find it strange!” That would probably work fine. I think you can actually get away with a bit more if you say “honestly” first and then are actually sincere. This would also signal social awareness.
I realize that I am in some sense dodging your hypothetical but I think your hypothetical is the problem. You haven’t thought hard enough about how this guy can succeed without lying.
A Self-Improvement and Epistemics Nerd has an online community for Self-Improvement and Epistemics Nerds. Half the people reading it are autists with bad social skills, who weren’t at exactly the right age to be saved by Disney’s Tangled. They struggle with navigating ordinary social situations and obtaining true beliefs because they’re bad at lying, and insufficiently aggressive at anticipating it in others.
Would they be doing anyone a favour by encouraging a social norm of truthfulness and the expectation of truthfulness in others, when all those people will inevitably have to leave the computer one day and end up like the subjects of the previous two examples? Would they be making the world a better place?
Yes and yes.
Contrary to common belief, LessWrong is not an autism support group.
And you know what? I think it made the world much better. Now we have places online and in the real world (Lighthaven, meetups, Berkeley) to gather and form a community around truthseeking and rationality. I like it. I’m glad it exists. I even think some important and powerful ideas have come out of it, and I think we’ve learned a lot together.
Saying words is just an action, like any other action. Whether the words are literally true or not is just a fact about the action, like any other fact about an action. It’s not the morally important fact. You judge actions by their consequences, whether you expect them to lead to more good or bad. Then you take the action with the best consequences overall.
Saying words is an action, but it’s not like any other action, because it can guide others towards or away from the truth. Similarly, torture is an action, but it’s not like any other action, because it consists of one person intentionally causing another immense pain.
Sure, we judge actions by their consequences, but we do not judge all actions in the same way. Some of them are morally repugnant, and we try very, very hard to never take them unless our hands are forced, and then only take them with immense regret and sorrow. There are various distinguishing factors. For instance, the consequences of torture seem likely to be almost always bad, so I never seriously consider it. Also, I don’t want to be the sort of person who tortures people (both for instrumental reasons and to some extent for intrinsic reasons). It’s actually pretty hard to fully disentangle my dispreference for torture from its consequences, because torture is inherently about causing suffering, and I neither want suffering to exist nor want to cause it (though the former is far more important to me).
My feelings about lying are the same. I love the truth, I love the truthseeking process, I love seeing curiosity in the eyes of children and adults and kittens. I hate lies, confusion, and deceiving others. This is partially because the truth is really useful for agents (and I like agents to be able to exercise their potential, typically), it’s partially because telling the truth seems to be best for me in most cases, and it’s partially because I just value truth.
Rationality can be about Winning, or it can be about The Truth, but it can’t be about both. Sooner or later, your The Truth will demand you shoot yourself in the foot, while Winning will offer you a pretty girl with a country-sized dowry. The only price will be presenting various facts about yourself in the most seductive order instead of the most informative one.
It can totally be about both if truth is part of winning. Yes, there are sometimes tradeoffs, and truth is not the singular source of value. But I think most of us value it very strongly, so presenting these two axes as orthogonal is highly misleading. And I want to share the truth with other people in case they decide to value it too—if not, they can always choose not to face it.
Also, there’s a missing mood in your example. When you value the truth, being honest tends to get you a lot of other things that you value; you tend to end up surrounded by the right people for you, being the kind of person you can respect, in the kind of place where you belong, even if you have to create it.
Now, you’re probably going to say that I can’t convince you by pure reason to intrinsically value the truth. That’s right. However, I also can’t convince you by pure reason to intrinsically value literally anything, and if you had written an essay about how we should consider killing or torturing people because it’s just an action like any other, I would have objected on similar grounds. You’re totally missing the fact that it’s wrong, and also (separately!) the consequences of following your advice would probably be bad for you, and certainly for most of us, over the long run.
I enjoyed reading this reply, since it’s exactly the position I’m dissenting against phrased perfectly to make the disagreements salient.
I don’t know, he could say “Honestly, I enjoy designing widgets so much that others sometimes find it strange!” That would probably work fine. I think you can actually get away with a bit more if you say “honestly” first and then are actually sincere. This would also signal social awareness.
I think this is what Eliezer describes as “The code of literal truth only lets people navigate anything like ordinary social reality to the extent that they are very fast on their verbal feet”. This reply works if you can come up with it, or notice this problem in advance and plan it out, but in a face-to-face interview it takes quite a lot of skill (more than most people have) to phrase something like that so that it comes off smoothly on a first try and without pausing to think for ten minutes. People who do not have the option of doing this because they didn’t think of it quickly enough get to choose between telling the truth as it sits in their head or else the first lie they come up with in the time it took the interviewer to ask the question.
I’m a bit of a rationalist dedicate/monk and I’d rather fight than lie—however I don’t think everyone is rationally or otherwise compelled to follow suit, for reasons that will be further explained.
Now, you’re probably going to say that I can’t convince you by pure reason to intrinsically value the truth. That’s right. However, I also can’t convince you by pure reason to intrinsically value literally anything
This is exactly the heart of the disagreement! Truthtelling is a value, and you can if you want assign it so high a utility score that you wouldn’t tell one lie to stop a genocide, but that’s a fact about the values you’ve assigned things, not about what behaviours are rational in the general case or whether other people would be well-served by adopting the behavioural norms you’d encourage of them. It shouldn’t be treated as intrinsically tied to rationalism, for the same reason that Effective Altruism is a different website. In the general case, do the actions that get you the things you value, and lying is just an action, an action that harms some things and benefits others that you may or may not value.
I could try to attack the behaviour of people claiming this value if I wanted, since it doesn’t seem to make a huge amount of sense: If you value The Truth for its own sake while still being a Utilitarian, how much disutility is one lie, in human lives? If it is more than 1/5000, the average person tells more than 5000 lies in their life and it’d be a public good to kill newborns before they can learn language and get started; and if it is less than 1/5000, GiveWell sells lives for ~$5k each, so you should be happy lying for a dollar. This is clearly absurd, and what you value is your own truthtelling or maybe the honesty of specifically your immediate surroundings, but again why? What is it you’re actually valuing, and have you thought about how to buy more of it?
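To make the dilemma concrete, here is a minimal sketch of the arithmetic in Python, using only the illustrative figures from the paragraph above (5000 lifetime lies, ~$5k per life via GiveWell); the specific per-lie disutilities passed in are hypothetical examples, not claims:

```python
# Sketch of the lie-pricing dilemma. The 5000-lies and $5k/life figures
# are the illustrative numbers from the text; the per-lie disutilities
# used below are hypothetical.

LIES_PER_LIFETIME = 5000   # assumed lifetime lie count, from the text
DOLLARS_PER_LIFE = 5000    # rough GiveWell cost to save one life, from the text

def lifetime_harm_in_lives(disutility_per_lie):
    """Total harm of one person's lifetime of lies, measured in lives."""
    return disutility_per_lie * LIES_PER_LIFETIME

def breakeven_donation_per_lie(disutility_per_lie):
    """Donation (in dollars) that offsets one lie at GiveWell's price."""
    return disutility_per_lie * DOLLARS_PER_LIFE

# Horn 1: price a lie above 1/5000 of a life, and a lifetime of lying
# costs more than a whole life.
print(lifetime_harm_in_lives(1 / 4000))       # 1.25 lives

# Horn 2: price it below 1/5000, and a one-dollar donation more than
# offsets a lie.
print(breakeven_donation_per_lie(1 / 10000))  # $0.50
```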
The meaning of the foot fetish tangent at the start is: I don’t understand this value that gets espoused as so important, or how it works internally. It’d be incredibly surprising to learn evolution baked something like that into the human genome. I don’t think Disney gave it to you. If it is culture, it is not the sort of culture that happens because your ancestors practiced it and obtained success; instead, your parents told you not to lie because they wanted the truth from you whether it served you well to give it to them or not, and when you grow up you internalise that commandment even as everyone else is visibly breaking it in front of you. I have a hard time learning the internals of this value that many others claim to hold, because they don’t phrase it like a value; they phrase it like an iron moral law that they must obey up to the highest limits of their ability without really bothering to do consequentialism about it, even those here who seem like devout consequentialists about other moral things like human lives.
I don’t think you got it.

What did you think about my objection to the Flynn example, or the value of the rationalist community as something other than an autism support group? I feel like you sort of ignored my stronger points and then singled out the widget job interview response because it seems to miss the point, but without engaging with my explanation of how it doesn’t miss the point. The way that you constructed the hypothetical, there was plenty of time to come up with an honest way to talk about how much he enjoyed widgets.
One of the things I value is people knowing and understanding the truth, which I find to be a beautiful thing. It’s not because someone told me to be honest at some point; it’s because I’ve done a lot of mathematics and read a lot of books and observed that the truth is beautiful.
I also wouldn’t shoot someone so I could tell someone else the truth. I don’t know where you got these numbers.
I suppose I’m not completely longtermist about my pursuit of truth, but I’m not completely longtermist about my other values either—sometimes the short term is easier to predict and get feedback from, etc.
If that was me not getting it, then probably I am not going to get it and continuing to talk has diminishing returns, but I’ll try to answer your other questions too and am happy to continue replying in what I hope comes across as mutual good faith.
What did you think about my objection to the Flynn example
It was incredibly cute but the kind of thing where people’s individual results tend to vary wildly. I am glad you are happy even if it was achieved by a different policy, but I don’t think any of my main claims are strongly undermined by it.
or the value of the rationalist community as something other than an autism support group
I agree the rationalist community is not actually an autism support group, and in particular that it has value as a way for people who want to believe true things to collaborate around getting more accurate beliefs, as well as for people who want to improve the ways they think, make better decisions, optimise their lives, etc. I think my thesis that truthtelling does not have the same essential character as truthseeking or truthbelieving is, if not correct, at least coherent and justifiable, and can be argued on its merits. I can want to believe true things so I can make better decisions without having an ideological commitment to honest speech, and people can collaborate around reaching true conclusions based on interrogating positions and seeking evidence rather than expecting and assuming honesty. For example, I do not think at any point in interrogating my claims in this post you have had to assume I am honest, because I am trying to methodically attach my reasoning and justifications to everything I say and am not really expecting to be believed where I don’t.
The way that you constructed the hypothetical, there was plenty of time to come up with an honest way to talk about how much he enjoyed widgets.
This seems like a non-central objection. If it is your only objection, note that I could with more careful thought have constructed a hypothetical where there was even more time pressure and an honest way to achieve their goal was even less within reach, and then we’d be back at the position my first hypothetical was intended to provoke. Unless, I suppose, you think there is no possible plausible social situation ever where refusing to lie predictably backfires, but I somehow really doubt that.
I also wouldn’t shoot someone so I could tell someone else the truth. I don’t know where you got these numbers.

The only number in my “how much bad is a lie if you think a lie is bad” hypothetical is taken from https://www.givewell.org/charities/top-charities under “effectiveness”, rounded up. The assumption that you have to assign a number is a reference to coherent decisions imply consistent utilities, and the other numbers are made up to explore the consequences of doing so.
One of the things I value is people knowing and understanding the truth, which I find to be a beautiful thing. It’s not because someone told me to be honest at some point; it’s because I’ve done a lot of mathematics and read a lot of books and observed that the truth is beautiful.
This is a more interesting reason than what I had (pessimistically) imagined, and I would count it a valid response to the side point I was making that intrinsic concern for personal truthtelling is prima facie weird. I think I agree with you that the truth is beautiful; I also read mathematics for fun and have observed it and felt the same way. I just don’t attach the same feeling to honest speech. I would want to retort that people knowing the truth is not always best served by you saying the truth, and you could still justify making terribly cutthroat utilitarian trade-offs around e.g. committing fraud to get money to fund teaching mathematics to millions of people in the third world, since it increases the total number of people knowing and understanding the truth overall. I also acknowledge regular utilitarians don’t behave like that for obvious second-order reasons, but my position is only that you have to think through the actual decision and not just assume the conclusion.
I feel like you sort of ignored my stronger points … without engaging with my explanation of how it doesn’t miss the point
If I ignored your strongest argument it was probably because I didn’t think it was central, didn’t think it was your strongest, or otherwise misunderstood it. Looking back, I’m actually unsure which of the parts I didn’t focus on you meant for me to focus on. The “Sure, we judge actions by their consequences, but we do not judge all actions in the same way. Some of them are morally repugnant, and we try very, very hard to never take them unless our hands are forced” part, maybe? The example you give is torture, which 1) always causes immediate severe pain by the definition of torture and 2) has been basically proven to be never useful for any goal other than causing pain in any situation you might reasonably end up in. Saying torture is always morally repugnant is much more supported by evidence, and is very different from saying the same of an action that frequently hurts nobody and happens a hundred times a day in normal small talk.
I agree that if I could produce a wonderful truthseeking society by telling a few lies it would be worth it; I just think that extreme sincere honesty is a better path for predictable first- and second-order reasons.