OK, so if I’m understanding correctly your suggestion is that in order to reconstruct your mind it would be necessary to do lots of simulations of you-like minds in order to adjust the (unfathomably many) parameters to find a mind that behaves in the right ways. I concede that that might be so.
It’s an interesting (and disturbing) idea because it suggests that (little bits of?) our lives might be simulated billions of times, with small variations, in the process of trying to reconstruct us. (If, that is, anyone is so interested in reconstructing us at all.) This seems to me to make a big difference to the moral calculus of attempted simulated resurrection—“we can reconstruct your mind-state and put a new instantiation of it somewhere wonderful” sounds like quite a different deal from “we can reconstruct your mind-state and put a new instantiation of it somewhere wonderful—but the reconstruction process will involve billions of simulated minds that more or less closely resemble yours passing through good approximations to all the events of your life that we could find out about”, and I’d be much less happy about the latter.
I have to say that it seems unlikely that enough information exists to do the reconstruction for anyone—even people who save as much information about themselves as you do, which most of us don’t. I mean, in some sense maybe it’s still there since everything we do has effects on everything else in our future light cone, but I’d expect the information to be unusable in something like the way that energy becomes unusable when it turns into waste heat in rough thermal equilibrium with its surroundings.
Yes, there could be moral objections to such a process quite apart from its likelihood of success. And I agree that there is unlikely to be enough information for it to work in any case.