I don’t think you got it.

What did you think about my objection to the Flynn example, or the value of the rationalist community as something other than an autism support group? I feel like you sort of ignored my stronger points and then singled out the widget job interview response because it seems to miss the point, but without engaging with my explanation of how it doesn’t miss the point. The way that you constructed the hypothetical there was plenty of time to come up with an honest way to talk about how much he enjoyed widgets.
One of the things I value is people knowing and understanding the truth, which I find to be a beautiful thing. It’s not because someone told me to be honest at some point, it’s because I’ve done a lot of mathematics and read a lot of books and observed that the truth is beautiful.
I also wouldn’t shoot someone so I could tell someone else the truth. I don’t know where you got these numbers.
I suppose I’m not completely longtermist about my pursuit of truth, but I’m not completely longtermist about my other values either—sometimes the short term is easier to predict and get feedback from etc.
If that was me not getting it, then probably I am not going to get it and continuing to talk has diminishing returns, but I’ll try to answer your other questions too and am happy to continue replying in what I hope comes across as mutual good faith.
What did you think about my objection to the Flynn example
It was incredibly cute but the kind of thing where people’s individual results tend to vary wildly. I am glad you are happy even if it was achieved by a different policy, but I don’t think any of my main claims are strongly undermined by it.
or the value of the rationalist community as something other than an autism support group
I agree the rationalist community is not actually an autism support group, and in particular that it has value as a way for people who want to believe true things to collaborate on getting more accurate beliefs, as well as for people who want to improve the ways they think, make better decisions, optimise their lives, etc. I think my thesis that truthtelling does not have the same essential character as truthseeking or truthbelieving is, if not correct, at least coherent and justifiable, and can be argued on its merits. I can want to believe true things so I can make better decisions without having an ideological commitment to honest speech, and people can collaborate on reaching true conclusions by interrogating positions and seeking evidence rather than expecting and assuming honesty. For example, I do not think at any point in interrogating my claims in this post you have had to assume I am honest, because I am trying to methodically attach my reasoning and justifications to everything I say, and am not really expecting to be believed on the points where I don’t.
The way that you constructed the hypothetical there was plenty of time to come up with an honest way to talk about how much he enjoyed widgets.
This seems like a non-central objection. If it is your only objection, note that I could, with more careful thought, have constructed a hypothetical where there was even more time pressure and an honest way to achieve their goal was even less within reach, and then we’d be back at the position my first hypothetical was intended to provoke. Unless, I suppose, you think there is no plausible social situation where refusing to lie predictably backfires, but I really doubt that.
I also wouldn’t shoot someone so I could tell someone else the truth. I don’t know where you got these numbers.

The only number in my “how much bad is a lie if you think a lie is bad” hypothetical is taken from https://www.givewell.org/charities/top-charities under “effectiveness”, rounded up. The assumption that you have to assign a number is a reference to coherent decisions imply consistent utilities, and the other numbers are made up to explore the consequences of doing so.
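To spell out the structure of that hypothetical symbolically (the particular figures were only illustrative): suppose the badness of one lie is some finite dollar-equivalent $d$, and GiveWell’s effectiveness figure gives a cost $c$ per expected life saved. Coherence then forces trade-offs of the form

$$\text{tell the lie} \;\succ\; \text{forgo } D \text{ dollars of donations} \quad\text{whenever } d < D,$$

and $D$ dollars of donations is roughly $D/c$ expected lives, so for any finite $d$ there is some number of lives at which refusing to lie is the dominated option. The made-up numbers were only there to make that point concrete.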
One of the things I value is people knowing and understanding the truth, which I find to be a beautiful thing. It’s not because someone told me to be honest at some point, it’s because I’ve done a lot of mathematics and read a lot of books and observed that the truth is beautiful.
This is a more interesting reason than what I had (pessimistically) imagined, and I would count it a valid response to the side point I was making that intrinsic concern for personal truthtelling is prima facie weird. I think I agree with you that the truth is beautiful; I also read mathematics for fun and have felt the same way observing it. I just don’t attach the same feeling to honest speech. I would want to retort that people knowing the truth is not always best served by you saying the truth, and that you could still justify terribly cutthroat utilitarian trade-offs, e.g. committing fraud to get money to fund teaching mathematics to millions of people in the third world, since that increases the total number of people knowing and understanding the truth. I acknowledge regular utilitarians don’t behave like that, for obvious second-order reasons, but my position is only that you have to think through the actual decision and not just assume the conclusion.
I feel like you sort of ignored my stronger points … without engaging with my explanation of how it doesn’t miss the point
If I ignored your strongest argument it was probably because I didn’t think it was central, didn’t think it was your strongest, or otherwise misunderstood it. Looking back, I’m actually unsure which part you meant for me to focus on. The “Sure, we judge actions by their consequences, but we do not judge all actions in the same way. Some of them are morally repugnant, and we try very, very hard to never take them unless our hands are forced” part, maybe? The example you give is torture, which 1) by definition always causes immediate severe pain, and 2) has been shown to be essentially never useful for any goal other than causing pain in any situation you might realistically end up in. Saying “torture is always morally repugnant” is much better supported by evidence, and is very different from saying the same of an action that frequently hurts nobody and happens a hundred times a day in normal small talk.
I agree that if I could produce a wonderful truthseeking society by telling a few lies it would be worth it; I just think that extreme sincere honesty is a better path, for predictable first- and second-order reasons.