The moral worth resides in each individual, since they have a subjective experience of the world, while a collective like “ants” does not. So doubling the ant population is twice as good.
Wouldn’t the hive need to have a subjective experience—collectively or as individuals—for it to be good to double their population in your example?
Whether they’re presently conscious or not, I wouldn’t want to bring ant-suffering into the world if I could avoid it. On the other hand, I do not interfere with them and it’s good to see them doing well in some places.
As for your five mentions of “utilitarianism”: I try to convey my view in the plainest terms. I do not mean to offend you or any -isms or -ologies of philosophy. I like reason and am here to learn what I can. Utilitarians are all friends to me.
I think ethics is just a matter of preference
I’m fine with that framing too. There are a lot of good preferences found commonly among sentient beings. Happiness is better than suffering precisely to the extent of those preferences, i.e. ethics.
The reason why it’s considered good to double the ant population is not necessarily because it’ll be good for the existing ants, but because it’ll be good for the new ants created. Likewise, the reason why it’ll be good to create copies of yourself is not because you will be happy, but because your copies will be happy, which is also a good thing.
Yes, under utilitarianism the ants would need to have subjective experience for making more of them to be good, because utilitarianism only values subjective experiences. Though, if your model of the world says that ant suffering is bad, then doesn’t that imply that you believe ants have subjective experience?
if your model of the world says that ant suffering is bad, then doesn’t that imply that you believe ants have subjective experience?
Indeed. I was questioning the proposition by Seth Herd that a collective like ants does not have subjective experience and so “doubling the ant population is twice as good.” I didn’t follow that line of reasoning and wondered whether it might be a mistake.
Likewise, the reason why it’ll be good to create copies of yourself is not because you will be happy, but because your copies will be happy
I don’t think creating a copy of myself is possible without repeating at least the amount of suffering I have experienced. My copies would be happy, but so too would they suffer. I would opt out of the creation of unnecessary suffering. (Aside: I am canceling my cryopreservation plans after more than 15 years of Alcor membership.)
Likewise, injury, aging and death are perhaps not the only causes of suffering in ants. Birth could be suffering for them too.
We do agree that suffering is bad, and that if a new clone of you would experience more suffering than happiness, then that would be bad. But does the suffering really outweigh the happiness they’ll gain?
You have experienced suffering in your life. But still, do you prefer to have lived, or do you prefer to not have been born? Your copy will probably give the same answer.
(If your answer is genuinely “I wish I wasn’t born”, then I can understand not wanting to have copies of yourself.)
One life like mine, that has experienced limited suffering and boundless happiness, is enough. Spinning up too many of these results in boundless suffering. I would not put this life on repeat, unlearning and relearning every lesson for eternity.