I feel like this position is… flimsy? Insubstantial? It’s not that I disagree; I just don’t understand why you would want to articulate it this way.
On the one hand, I don’t think the biological/non-biological distinction is very meaningful from a transhumanist perspective. Is an embryo genetically modified to have +9000 IQ going to be meaningfully considered “transhuman” rather than “posthuman”? Are you still going to be you after one billion years of life extension? “Keeping the relevant features of you/humanity through enormous biological changes” seems qualitatively the same as “keeping the relevant features of you/humanity through mind uploading”—i.e., if you know at the gears level which features of biological brains are essential to keep, you have a rough understanding of what you should work on in uploading.
On the other hand, I totally agree that if you don’t feel adventurous and you don’t want to save the world at the price of your personality’s death, it would be a bad idea to undergo uploading with whatever the closest-to-modern technology can provide. It just means you need to wait for more technological progress. If we are in the ballpark of radical life extension, I don’t see any reason not to wait 50 years to perfect upload tech, and I don’t see any reason why 50 years wouldn’t be enough, conditional on at least normally expected technological progress.
The same goes for AIs. If we can have children who are meaningfully different from us, and who can become even more different in a glorious transhumanist future, I don’t see any reason not to have AI children, conditional on their designs preserving all the important features we want to see in our children. The problem is that we are not on track to create such designs, not that such designs can’t conceptually exist.
And all of this seems straightforwardly deducible from the concept of transhumanism itself, i.e., the concept that the good future is one filled with beings capable of meaningfully saying that they were Homo sapiens and stopped being Homo sapiens at some point in their lives. The moment you say “I want radical life extension,” you run into the question “wait, am I going to be me after one billion years of life extension?” and you start down The Way through all the questions about self-identity, the essence of humanity, succession, et cetera.