What makes it unserious? Is it that there are too many assumptions baked in to the scenario as described, so that it’s unlikely to match real challenges we will actually face?
I think it’s a problem for future people (and a fairly technically difficult one at that), and it doesn’t matter much whether we think about a plausible solution in advance. Whether future people solve this problem doesn’t look like it will have much bearing on the overall sweep of history.
I think the problem is very likely to be resolved by different mechanisms based on trust and physical control rather than cryptography.
I think the slowdowns involved, even in a mature version of this idea, are likely impractical for the large majority of digital minds. So this isn’t a big deal morally during the singularity, and after the singularity I don’t think it will be relevant.
Makes sense, thanks!
Do you expect these mechanisms to also resolve the case where a biological human is forcibly uploaded in horrible conditions?