http://lesswrong.com/lw/rr/the_moral_void/
For the lazy, the punchline:
And if an external objective morality does say that the universe should occupy some horrifying state… let’s not even ask what you’re going to do about that. No, instead I ask: What would you have wished for the external objective morality to be instead? What’s the best news you could have gotten, reading that stone tablet?
Go ahead. Indulge your fantasy. Would you want the stone tablet to say people should die of old age, or that people should live as long as they wanted? If you could write the stone tablet yourself, what would it say?
Maybe you should just do that?
I mean… if an external objective morality tells you to kill people, why should you even listen?
No. Humans are not perfect moral reasoners; if a racist booted up a Friendly superintelligence and it mentioned black people, should he conclude he had screwed up?
It’s unlikely, to my mind, that the arguments presented by the OP are correct—but they are not trivially false. There are people who already espouse similar views; hell, there are people who believe we should all commit suicide now, to spare future generations the pain of living. To simply say “I don’t want to do that” is a fully general counterargument, and if it were properly integrated, you would be immune to any moral argument.
(You would also have to conclude that everyone but you has a wildly different utility function, which would rather defeat the purpose of CEV.)