I wouldn’t pass up digital immortality, but personal survival matters less to me than collective survival. Even from a purely narcissistic standpoint, a human living after another 1,000 years of cultural change has at least as much in common with me as a digital immortal would 1,000 years from now, even if the latter has continuity of consciousness with my present self.
I think it’s plausible that some variables describing your essential computational properties and the way you self-actualize aren’t shared by anyone else.
(Also, consciousness is just a pattern being processed, and it’s unclear whether continuity of consciousness requires causal continuity. Imagine a robot that gets restored from a one-second-old backup. That pattern has no causal continuity with its self from a moment ago, but it seems more intuitive to see this as a one-second memory loss rather than death.)
That would lock us away from digital immortality forever. (Edit: Well, not necessarily. But I would be worried about that.)