Lie by default whenever you think it passes an Expected Value Calculation to do so, just as for any other action.
How do you propose to approximately carry out such a process, and how much effort do you put into pretending to do the calculation?
I’m not as much of a stickler/purist/believer in honesty-as-always-good as many around here; I think there are many times when deception of some sort is a valid, good, or even morally required choice. I definitely think e.g. Kant was wrong about honesty as a maxim, even within his own framework. But in practice, I think your proposed policy sets much too low a standard, and the gap between what you proposed and “Lie by default whenever it passes an Expected Value Calculation to do so, just as for any other action” is enormous, both in theoretical defensibility and in the skillfulness (and internal levels of honesty and self-awareness) required to execute it successfully.
How do you propose to approximately carry out such a process, and how much effort do you put into pretending to do the calculation?
The thing I am trying to gesture at might be better phrased as “do it if it seems like a good idea, by the same measures you’d use to judge whether any other action is a good idea”. But then I worry that some overly conscientious people will just always judge lying to be a bad idea and stay in the same trap, so I kind of want to say “do it if it seems like a good idea, and don’t just immediately dismiss it or assign some huge unjustifiable negative weight to all actions that involve lying”. But then I worry they’ll argue over how much of a negative weight can be justified, so I also want to say “assign lying a negative weight proportional to a sensible assessment of the risks involved and the actual harm to the commons of doing it, and not some other, bigger weight”. At some point I gave up and wrote what I wrote above instead.
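To make that last formulation concrete, here is a toy sketch in Python. Everything in it (the numbers, the outcome lists, the function names) is invented for illustration; it’s a cartoon of the decision rule, not a real procedure.

```python
# Toy sketch of the decision rule above. Every number and name here is
# invented for illustration; this is not a real decision procedure.

def expected_value(outcomes):
    """Plain EV: sum of probability * value over the possible outcomes."""
    return sum(p * v for p, v in outcomes)

def lying_penalty(risk_to_self, harm_to_commons):
    """The 'sensible' negative weight: proportional to the actual risks
    and the actual harm to the commons, not some huge fixed constant
    that forbids lying outright."""
    return risk_to_self + harm_to_commons

def should_lie(lie_outcomes, truth_outcomes, risk_to_self, harm_to_commons):
    """Lie iff it beats truth-telling after subtracting the penalty."""
    ev_lie = expected_value(lie_outcomes) - lying_penalty(risk_to_self, harm_to_commons)
    ev_truth = expected_value(truth_outcomes)
    return ev_lie > ev_truth

# A small social lie: usually smooths things over, occasionally backfires.
lie_outcomes = [(0.9, 5.0), (0.1, -20.0)]   # EV = 2.5
truth_outcomes = [(1.0, 0.5)]               # mildly awkward but safe
print(should_lie(lie_outcomes, truth_outcomes,
                 risk_to_self=1.0, harm_to_commons=0.5))  # True: 1.0 > 0.5
```

The only point the sketch is making is that the penalty scales with the assessed risk and harm, rather than being a fixed prohibition in disguise.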
Putting too much thought into making a decision is also not a useful behavioural pattern, but that’s probably the topic of a different post; many others have written about it already, I think.
I think your proposed policy sets much too low a standard, and the gap between what you proposed and “Lie by default whenever it passes an Expected Value Calculation to do so, just as for any other action” is enormous
I would love to hear, from other non-believers in ‘honesty-as-always-good’, alternative proposed standards that are actually workable in real life and don’t amount to tying a chain around your own leg. If there were ten posts like this, we could put them in a line and people could pick the point on that line that feels right.
I think the difficulty of making each decision about lying as an independent decision is the main argument for treating honesty as a virtue-ethics or deontological issue.
I think you make many good points in the essay arguing that one should not simply follow a rule of honesty. In practice, I think the difference can be split, and that is in fact what most rationalists and other wise human beings do. I also think it is highly useful to write this essay on the mini virtues of lying, so that the difference can be split well.
There are many subtle downsides to lying, so simply adding a bit of a fudge factor that weighs against it is one way to avoid taking forever to make each decision. You’ve talked about practicing making the decision quickly, and I suspect such a fudge factor is the result of that practice.
This is a separate issue, but your point about being technically correct is also a valuable one. It is clearly not honest to say things you know will cause the listener to form false beliefs.
I have probably erred on the side of honesty, as have many rationalists, treating it not as an absolute deontological issue and being willing to fudge a little on the side of the technically correct to maintain social graces in some situations. I enjoy a remarkable degree of trust from my true friends, because they know me to be reliably honest. However, I have probably suffered reputational damage from acquaintances and failed friends, for whom my exceptional honesty has proven hurtful. Those people don’t have adequate experience with me to see that I am reliably honest and to appreciate the advantages of having a friend who can be relied upon to tell the truth, because they ceased being my friends as soon as my unhelpful honesty insulted or irritated them.
There is much here I agree with and much I disagree with. But I think this topic is hugely valuable for the rationalist community, and you’ve written it up very well. Nice work!
We apply different standards of behavior for different types of choices all the time (in terms of how much effort to put into the decision process), mostly successfully. So I read this reply as something like, “Which category of ‘How high a standard should I use?’ do you put ‘Should I lie right now?’ in?”
A good starting point might be: one rank higher than you would for not lying; see how it goes and adjust over time. If I tried to make an effort-ranking of all the kinds of tasks I regularly engage in, I expect there would be natural clusters I could roughly draw an axis through. E.g. I put more effort into client-facing or boss-facing tasks at work than I do into casual conversations with random strangers. I put more effort into setting the table, washing dishes, and plating food for holidays than for a random Tuesday. Those clusters are probably more than one rank apart, but for any given situation, I think the bar for lying should be somewhere in the vicinity of a gap of that size.
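As a toy illustration of the “one rank higher” heuristic (in Python; the categories and rank numbers are all made up for the example, not taken from anywhere):

```python
# Toy illustration of the "one rank higher" heuristic; the categories
# and rank numbers are invented for the example.

EFFORT_RANK = {
    "casual chat with a stranger": 1,
    "dinner on a random Tuesday": 1,
    "holiday dinner": 2,
    "client-facing or boss-facing task": 3,
}

def effort_rank_for_lying_decision(situation):
    """Give 'should I lie here?' one rank more effort than the situation
    itself would normally get; see how it goes and adjust over time."""
    return EFFORT_RANK[situation] + 1

print(effort_rank_for_lying_decision("casual chat with a stranger"))        # 2
print(effort_rank_for_lying_decision("client-facing or boss-facing task"))  # 4
```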