I’m still trying to understand what Eliezer really means by this question. Here is a list of a few reasons why I don’t kill the annoying kid across the street. Which of these reasons might disappear upon my being shown this proof?
1. The kid and his friends and family would suffer, and since I don’t enjoy suffering myself, my ability to empathise stops me from wanting to.
2. I would probably be arrested and jailed, which doesn’t fit in with my plans.
3. I have an emotional reaction to the idea of killing a kid (in such circumstances—though I’m not actually sure that this disclaimer is necessary): it fills me with such revulsion that I doubt I would actually be able to carry out the task. My emotions would prevent my body from working properly.
4. I recognise that the kid is not causing very much harm to me. It seems fair to cause little harm to him in return.
5. My family and friends might suffer because they might imagine they could have prevented my doing this and failed to (guilt, I suppose, is the word); see 1. This reaction is even stronger because I have a vested interest in my friends and family not suffering.
6. I myself would suffer guilt as a result of 1, 3 and 4, and I don’t enjoy suffering.
I suppose 2 wouldn’t change, because “it all adds up to normality” (although, as I said in my last comment, I don’t think this could add up to normality; hence my trying to understand the question better), so other people’s actions would not be altered. It would be something in me that changed: a new understanding that affected my value judgements. What would it affect? The fact that I don’t like suffering, which would take out 1 and 6? My ability to empathise, taking out 1 and 5? My emotional reactions, taking out 3 and possibly 6? My ability to judge what is fair and what is unfair—or the fact that I care about acting fairly—taking out 4?
Perhaps all I’ve done here is attempt to Taboo the concept of morality for one particular case. Saying “it’s immoral to kill the kid” suggests that the concept of morality not really existing makes sense. My list reveals that I, at least, can’t make sense of it. I’m still confused as to what the question really means.
Are there no vegetarians on OvBias?
I guess I don’t properly understand the question. I don’t know what “nothing is moral and nothing is right” means. To me, morality appears to be an internal thing, not something imposed from the outside: it’s inextricably bound up with my desires and motives and thoughts, and with everyone else’s. So how can you remove morality without either changing those desires and motives and thoughts so that I would no longer recognise them as anything to do with me, or removing them entirely? You can decide that it might be convenient to have pi equal to three, but it transpires that you can’t just declare this, because now you can’t use mathematics any more, so you can’t use your pi-that-is-equal-to-three. Similarly, you can postulate the non-existence of morality, but it seems that now you can’t make conjectures about humans and how they might react, because they don’t work any more.
I suppose it comes down to reacting in the same way as Daniel Reeves and Caledonian: things aren’t like that, and they can’t be—the question doesn’t make sense to me.