Eliezer, please don’t think you can offend me by disagreeing with me or questioning my opinions—every disagreement (between rational people) is another precious opportunity for someone (hopefully me!) to get closer to Truth; if the person correcting me is someone I believe with high probability to be smarter than me, or to have thought through the issue at hand better than I have (and you fit those criteria!), this only raises the probability that it is I who stand to benefit from the disagreement.
I’m not certain this is a very good answer to your question, but 1) I would not take such a pill, because I enjoy empathy and don’t think pain is always bad, 2) people’s deaths negatively affect many people (both through the ontologically positive grief incurred by the loss and through the ontologically negative utility they would have produced), and that negative effect is very likely to make its way to me through the Web of human interaction, especially if the deceased are young and have not yet had much of a chance to spread utility through the Web, and 3) I would have to be quite efficient at killing 12-year-olds for it to be worth my time to do it for a dollar each (although of course this is tangential to your question, since the amount “a dollar” was arbitrary).
I should also point out that I have a strongly negative psychological reaction to violence. For example, I find the thought of playing a first-person shooting game repugnant, because even pretending to shoot people makes me feel terrible. I just don’t know what there is out there worse than human beings deliberately doing physical harm to one another. As a child, I felt little empathy for my fellow humans, but at some point, it was as if I was treated with Ludovico’s Technique (à la A Clockwork Orange)… maybe some key mirror neurons in my prefrontal cortex just needed time to develop.
Thank you for taking time to make me think about this!
If your moral code penalizes things that make you feel bad, and doing X would make you feel bad, then is it fair to say that not doing X is part of your moral code?
I think the point Eliezer was getting at is that human morality is very complex, and statements like “I’m an egoist” sweep a lot of that under the rug. And to continue his example: what if the pill not only prevented all pain from your conscience, but also gave you enjoyment (in the form of serotonin or whatever) at least as good as what you get from empathy?
You’re right, human morality is more complex than I thought it was when “I am an egoist” seemed like a reasonable assertion, and all the fuzzies I got from “resolving” the question of ethics prevented me from properly updating my beliefs about my own ethical disposition.
Statements like “I’m an altruist” do too. They are, however, less likely to be challenged.