In total utilitarianism, killing someone (in a painless and unexpected manner) and creating or giving birth to another being of comparable happiness is a morally neutral act. In fact, if one can kill a billion people to create a billion and one, one is morally compelled to do so. And this is true for real people, not just thought-experiment people: living people with dreams, aspirations, grudges, and annoying or endearing quirks.
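The arithmetic behind this claim can be made explicit. As a minimal sketch, assuming for illustration that every person contributes the same utility $u > 0$ to the total:

```latex
% Total utility before and after the replacement,
% assuming each person contributes the same utility u > 0 (an illustrative assumption)
\underbrace{10^{9} \cdot u}_{\text{before}} \;<\; \underbrace{(10^{9} + 1) \cdot u}_{\text{after}}
```

The replacement raises total utility by $u$, and total utilitarianism ranks outcomes by the total alone, so it must count the act as an improvement.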
Keep in mind that the people being brought into existence will be equally real people, with dreams, aspirations, grudges, and annoying or endearing quirks. If the people being killed had any more of what you value overall, then it wouldn’t be a utility-neutral act.
Imagine that a billion people are annihilated from existence and replaced with exact copies, indistinguishable in any way. Don’t judge a person’s plan to execute this, since a plan could entail some sort of mistake; suppose instead that this simply happens, so we must judge it purely by its results. Do you think that this would be a bad thing?
If not, then presumably it’s not the destruction and replacement you’re objecting to in and of itself; you’re implicitly assuming a higher utility value for the people who’re destroyed than for those who’re created, or some chance of an outcome other than a perfect replacement of all the people with equal-utility people.