As with AGI, risks are a reason to be careful, not a reason to give up indefinitely on doing it right. I think superintelligence is very likely to precede uploading (unfortunately), so if humanity is allowed to survive, the risks of making technical mistakes with uploading won't really be an issue.
I don't see how this has anything to do with "succession", though; there is a world of difference between developing options and forcing them on people who don't agree to take them.