There’s no reason for me to think that my personal preferences (e.g. that my descendants exist) are related to the “right thing to do”, and so there’s no reason for me to think that optimizing the world for the “right things” will fulfil my preference.
This, and several of the passages in your original post such as, “I agree such a definition of moral value would be hard to justify,” seem to imply some assumption of moral realism that I sometimes encounter as well, but have never really found convincing arguments for. I would say that the successionists you’re talking to are making a category error, and I would not much trust their understanding of ‘should’-ness outside normal day-to-day contexts.
I can’t really imagine a scenario where I “should” or would be ok with currently existing humans going extinct, though that doesn’t mean none could exist. I can, however, imagine a future where humanity chooses to cease (most?) natural biological reproduction in favor of other methods of bringing new life into the world, whether biological or artificial, which I could endorse (especially if we become biologically or otherwise immortal as individuals). I can further imagine being ok with those remaining biological humans each changing, gradually or suddenly, various aspects of their bodies, their minds, and the substrates their minds run on, until they are no longer meat-based and/or no longer ‘human’ in various ways most people currently understand the term.
I broadly agree.
I am indeed being a bit sloppy with the moral language in my post. What I mean to say is something like “insofar as you’re trying to describe a moral realist position with a utility function to be optimized for, it’d be hard to justify valuing your specific likeness”.
In a similar fashion, I prefer and value my family more than your family, but it’d be weird for me to say that you should also prefer my family to your own.
However, I expect our interests and preferences to align when it comes to preferring that we have the right to prefer our own families, or preferring that our species exists.
(Meta: I am extremely far from an expert on moral philosophy, or philosophy in general, but I do aspire to improve how rigorously I am able to articulate my positions.)
In other words: it sounds like you don’t want to be replaced under any conditions you can foresee. You have judged. What else is there?