Sorry, but my moral valuations aren’t up for grabs. I’m not perfectly selfish, but neither am I perfectly altruistic; I care more about the welfare of agents more like me, and particularly about the welfare of agents who happen to remember having been me. That valuation has been drummed into my brain pretty thoroughly by evolution, and it may well survive in any extrapolation.
But at this point, I think we’ve passed the productive stage of this particular discussion.