IMO the whole “upload” thing changes drastically depending on our understanding of consciousness and continuity of the self (which is currently nearly non-existent). It’s like teleportation: I wouldn’t willingly let either of those happen to me unless someone could convincingly explain to me precisely how my qualia are associated with my brain and how they would carry over (rather than the process just killing me and creating a different entity).
I don’t believe it’s impossible for an upload to be “me”. But I doubt it’d be as easy as simply making a scan of my synapses and calling it a day. If it is, and if that “me” is then also infinitely copiable, I’d be very ambivalent about it (given all the possible ways it could go horribly wrong—see this story or the recent animated show Pantheon for ideas).
So it’s definitely an “ok, but” position for me. I’d probably feel more comfortable with a “replace my brain bit by bit with artificial functional equivalents” scenario, since that seems to genuinely preserve continuity of self.
I think a big reason uploads may be much worse than regular life is not that the brain scan won’t be good enough, but that uploads won’t be able to interact with the real world the way a physical human can.
Edit: I guess with sufficiently good robotics the ems would be able to interact with the same physical world as us, in which case I’d be much less worried.
I’d say even a purely simulated physical environment could be good enough to be indistinguishable from the real one. As Morpheus put it:
What is real? How do you define ‘real’? If you’re talking about what you can feel, what you can smell, what you can taste and see, then ‘real’ is simply electrical signals interpreted by your brain.
Of course, that would require insane amounts of compute, but so would running a brain upload in the first place.