Recreating a person from haphazard data would almost certainly be impossible without an FAI, and with an FAI you don’t need to make wishes: it knows better than you do whether you should have wished for something or not.
This is probably true, though there’s still the possibility of e.g. the FAI having low confidence about what I’d have wished, and the values of the population being such that it chooses to err on the side of not making copies when there’s uncertainty. Not a scenario I’d assign a terribly high probability to, but then explicitly giving permission only took a few minutes anyway.
Your permission doesn’t answer the relevant question of whether it should reconstruct you; it only tells it that you think it should (and, of course, it really shouldn’t: there are better alternatives).
of course, it really shouldn’t, there are better alternatives
I share this intuition as well and sometimes bring it up during discussions with SIAI people about cryonics. Can you explain your reasoning further? My arguments have been along the lines of “All the resources an FAI would need to upload cryonics patients would be enough for years and years and years of simulated fun-theory agents, or whatever an FAI would use computronium for.”
In general I guess I just assume that post-Singularity computronium (assuming the FAI doesn’t just hack out of any matrices it can, and setting aside acausal trade) would be used for things we can’t really anticipate; probably not lots of happy brain emulations. But others think this ‘identity’ thing is really important to humanity, and that we’re likely to hold onto it even through volition extrapolation. In response I’m often reminded of Wei’s ‘Complexity of Value != Complexity of Outcome’.
What are your thoughts on the matter?
I antipredict that “people get reanimated”, but it doesn’t follow that preserving people’s minds using cryonics is morally irrelevant, or less morally relevant than the corresponding chance of saving a human life. By preserving your mind, you give the future additional information that it can use to produce more value, even if it doesn’t use that information in the status quo manner (by reanimating you).
Agreed, and thanks for sharing.