I enjoyed reading this reply, since it’s exactly the position I’m dissenting against, phrased perfectly to make the disagreements salient.
I don’t know, he could say “Honestly, I enjoy designing widgets so much that others sometimes find it strange!” That would probably work fine. I think you can actually get away with a bit more if you say “honestly” first and then are actually sincere. This would also signal social awareness.
I think this is what Eliezer describes as “The code of literal truth only lets people navigate anything like ordinary social reality to the extent that they are very fast on their verbal feet”. This reply works if you can come up with it, or notice this problem in advance and plan it out, but in a face-to-face interview it takes quite a lot of skill (more than most people have) to phrase something like that so that it comes off smoothly on a first try, without pausing to think for ten minutes. People who do not have the option of doing this, because they didn’t think of it quickly enough, get to choose between telling the truth as it sits in their head and the first lie they come up with in the time it took the interviewer to ask the question.
I’m a bit of a rationalist dedicate/monk and I’d rather fight than lie; however, I don’t think everyone is rationally or otherwise compelled to follow suit, for reasons that will be further explained.
Now, you’re probably going to say that I can’t convince you by pure reason to intrinsically value the truth. That’s right. However, I also can’t convince you by pure reason to intrinsically value literally anything.
This is exactly the heart of the disagreement! Truthtelling is a value, and you can, if you want, assign it so high a utility score that you wouldn’t tell one lie to stop a genocide, but that’s a fact about the values you’ve assigned things, not about what behaviours are rational in the general case, or about whether other people would be well-served by adopting the behavioural norms you’d encourage of them. It shouldn’t be treated as intrinsically tied to rationalism, for the same reason that Effective Altruism is a different website. In the general case, do the actions that get you the things you value; lying is just an action, one that harms some things and benefits others that you may or may not value.
I could try to attack the behaviour of people claiming this value if I wanted, since it doesn’t seem to make a huge amount of sense: if you value The Truth for its own sake while still being a utilitarian, how much disutility is one lie, measured in human lives? If it is more than 1/5000, then since the average person tells more than 5,000 lies in their life, it would be a public good to kill newborns before they can learn language and get started; if it is less than 1/5000, then since GiveWell sells lives for ~$5k each, you should be happy to lie for a dollar. This is clearly absurd, and what you actually value is your own truthtelling, or maybe the honesty of specifically your immediate surroundings, but again, why? What is it you’re actually valuing, and have you thought about how to buy more of it?
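To spell the dilemma out as arithmetic (a sketch using only the figures above, with $c$ introduced here as the made-up free parameter for the disutility of one lie, measured in lives):

$$
\begin{aligned}
c > \tfrac{1}{5000}\ \text{lives per lie} &\;\Rightarrow\; 5000\ \text{lies} \times c > 1\ \text{life of harm from an average lifetime of lies},\\
c < \tfrac{1}{5000}\ \text{lives per lie} &\;\Rightarrow\; c \times \$5{,}000\ \text{per life} < \$1\ \text{of harm per lie}.
\end{aligned}
$$

Either branch lands somewhere absurd, which is the point.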
The meaning of the foot fetish tangent at the start is that I don’t understand this value that gets espoused as so important, or how it works internally. It’d be incredibly surprising to learn evolution baked something like that into the human genome. I don’t think Disney gave it to you. If it is culture, it is not the sort of culture that happens because your ancestors practiced it and obtained success; instead, your parents told you not to lie because they wanted the truth from you whether it served you well to give it to them or not, and then when you grow up you internalise that commandment even as everyone else is visibly breaking it in front of you. I have a hard time learning the internals of this value that many others claim to hold, because they don’t phrase it like a value, they phrase it like an iron moral law that they must obey up to the highest limits of their ability, without really bothering to do consequentialism about it, even those here who seem like devout consequentialists about other moral things like human lives.
I don’t think you got it.

What did you think about my objection to the Flynn example, or the value of the rationalist community as something other than an autism support group? I feel like you sort of ignored my stronger points and then singled out the widget job interview response because it seems to miss the point, but without engaging with my explanation of how it doesn’t miss the point. The way that you constructed the hypothetical, there was plenty of time to come up with an honest way to talk about how much he enjoyed widgets.
One of the things I value is people knowing and understanding the truth, which I find to be a beautiful thing. It’s not because someone told me to be honest at some point; it’s because I’ve done a lot of mathematics and read a lot of books and observed that the truth is beautiful.
I also wouldn’t shoot someone so I could tell someone else the truth. I don’t know where you got these numbers.
I suppose I’m not completely longtermist about my pursuit of truth, but I’m not completely longtermist about my other values either—sometimes the short term is easier to predict and get feedback from etc.
If that was me not getting it, then probably I am not going to get it and continuing to talk has diminishing returns, but I’ll try to answer your other questions too, and am happy to continue replying in what I hope comes across as mutual good faith.
What did you think about my objection to the Flynn example
It was incredibly cute but the kind of thing where people’s individual results tend to vary wildly. I am glad you are happy even if it was achieved by a different policy, but I don’t think any of my main claims are strongly undermined by it.
or the value of the rationalist community as something other than an autism support group
I agree the rationalist community is not actually an autism support group, and in particular that it has value as a way for people who want to believe true things to collaborate on getting more accurate beliefs, as well as for people who want to improve the ways they think, make better decisions, optimise their lives, etc. I think my thesis, that truthtelling does not have the same essential character as truthseeking or truthbelieving, is if not correct then at least coherent and justifiable, and can be argued on its merits. I can want to believe true things so I can make better decisions without having an ideological commitment to honest speech, and people can collaborate on reaching true conclusions by interrogating positions and seeking evidence rather than by expecting and assuming honesty. For example, I do not think at any point in interrogating my claims in this post you have had to assume I am honest, because I am trying to methodically attach my reasoning and justifications to everything I say, and am not really expecting to be believed where I don’t.
The way that you constructed the hypothetical, there was plenty of time to come up with an honest way to talk about how much he enjoyed widgets.
This seems like a non-central objection. If it is your only objection, note that I could with more careful thought have constructed a hypothetical where there was even more time pressure and an honest way to achieve their goal was even less within reach, and then we’d be back at the position my first hypothetical was intended to provoke. Unless, I suppose, you think there is no plausible social situation ever where refusing to lie predictably backfires, but I somehow really doubt that.
I also wouldn’t shoot someone so I could tell someone else the truth. I don’t know where you got these numbers.
The only number in my “how much bad is a lie if you think a lie is bad” hypothetical is taken from https://www.givewell.org/charities/top-charities under “effectiveness”, rounded up. The assumption that you have to assign a number is a reference to “Coherent decisions imply consistent utilities”, and the other numbers are made up to explore the consequences of doing so.

One of the things I value is people knowing and understanding the truth, which I find to be a beautiful thing. It’s not because someone told me to be honest at some point; it’s because I’ve done a lot of mathematics and read a lot of books and observed that the truth is beautiful.
This is a more interesting reason than what I had (pessimistically) imagined, and I would count it a valid response to the side point I was making, that intrinsic concern for personal truthtelling is prima facie weird. I think I agree with you that the truth is beautiful; I also read mathematics for fun and have observed it and felt the same way. I just don’t attach the same feeling to honest speech. I would want to retort that people knowing the truth is not always best served by you saying the truth, and you could still justify making terribly cutthroat utilitarian trade-offs around, e.g., committing fraud to get money to fund teaching mathematics to millions of people in the third world, since that increases the total number of people knowing and understanding the truth. I acknowledge regular utilitarians don’t behave like that, for obvious second-order reasons, but my position is only that you have to think through the actual decision and not just assume the conclusion.
I feel like you sort of ignored my stronger points … without engaging with my explanation of how it doesn’t miss the point
If I ignored your strongest argument it was probably because I didn’t think it was central, didn’t think it was your strongest, or otherwise misunderstood it. Looking back, I’m actually unsure which of the parts I didn’t focus on was the one you meant for me to focus on. The “Sure, we judge actions by their consequences, but we do not judge all actions in the same way. Some of them are morally repugnant, and we try very, very hard to never take them unless our hands are forced” part, maybe? The example you give is torture, which 1) always causes immediate severe pain, by the definition of torture, and 2) has been basically proven to be useless for any goal other than causing pain in any situation you might reasonably end up in. Saying torture is always morally repugnant is much better supported by evidence, and is very different from saying the same of an action that frequently hurts nobody and happens a hundred times a day in normal small talk.
I agree that if I could produce a wonderful truthseeking society by telling a few lies it would be worth it; I just think that extreme sincere honesty is a better path, for predictable first- and second-order reasons.