Seems ~correct[1] under selfish preferences, but if you care broadly about humanlike life (even just your kids, or your kids' kids, etc.), this changes dramatically, as the value could be realized later. Add to that the possibility of resurrection via cryonics or digital-data reconstruction. Also, my estimates of progress and doom are well towards the end of the range where long pauses are called for even on purely selfish grounds.
I'm somewhat worried this shifts people towards incorrectly rushing ahead because their world-model is missing some of the above, but I respect the depth and detail of the analysis.
Besides cryonics, if you could "revive" someone by figuring out the exact parameters of our universe, simulating history, and thus locating the people who died and saving them, this changes the calculus further: now we need to consider extending the lifespan of everyone who has ever lived.
Mind uploading, besides directly increasing lifespan, could also result in copies and branches of selves. I find it extremely difficult to reason about the utility of having N copies of myself.
It's unclear to me whether this is possible, but there's at least some uncertainty for me here when we talk about superintelligence.
Oh, and the 1400y figure seems pretty arbitrary; I'd mostly guess that post-AGI life is vastly longer than that?