My issue isn’t with cryonics, it is with the whole notion of self identity and post-singularity personhood. I guess this ties into EY’s ‘fun theory’ and underscores the importance of a working positive theory of ‘fun’ as a prerequisite for immortality as we currently define it.
Assume cryonics works; furthermore, assume that your brain is scanned with enough resolution to capture all salient features of what you consider to be your mind. You are now an uploaded entity, and your mind is as malleable as any other piece of software. There are only so many clock cycles to spend indulging in hedonism and utopian bliss before that gets old. So then, naturally, you expand your mind, you merge with other minds, and whatever else is possible. Very soon you will no longer resemble anything of what we consider a ‘self’: not just a human self with our evolved emotions and thoughts, but a ‘self’ in any sense we can define. So then, what is the point of trying to preserve yourself if you are going to transcend the self anyway?
I’d want to make sure humanity continues, or that some posthuman eventuality carries on where we leave off. I’d also want to live as fruitful an existence as I can for as long as I am alive. If that means I reach the singularity and enjoy a certain number of clock cycles in utopia before the idea of utopia ceases to have any meaning, then lucky me. If not, I’m content to have existed at all and to have done my part in trying to ensure the continuation and spread of consciousness. But it makes no sense to go to such lengths to preserve myself needlessly in the face of such a (near) infinite expansion of consciousness.
This can also be viewed as an update to Camus’s question of suicide. Is not signing up for cryonics, or dying while knowing death could be overcome, similar to suicide? I admit I have contemplated suicide in the past, but I’m over it. It pains me to feel the same emotions once again when contemplating cryonics.
Your view of cryonics seems similar to the concept of reincarnation.
Also, some would find personal value in setting up another entity for success. Namely, anyone who has children, but also anyone who argues for the good of the next generation. Essentially, it seems as if you are arguing against giving birth to a new you.
While that makes sense in terms of prolonging your own existence, it certainly seems to land on the “evil” side of selfish. Not that that is a problem… just interesting.
So you have one precise reason not to want to live on, and it hinges on quite a few assumptions, right? Adding details to a story makes it less probable.
Could you imagine a few other scenarios where things go right instead? So long as you’re alive, there’s always at least the unexpected. You can’t predict what your future will be, especially post-singularity. Even if you can’t imagine right now how your life could be pleasant, or how to make things turn out well, you aren’t expected to outsmart your future self, so why wouldn’t it find that solution?
I’ll list the most obvious thing to me here, if that can help: I don’t see how expanding yourself equates with merging with other minds and losing your individuality. If there were no way to get individual minds bigger than ours without a loss of individuality, then please explain how humans came to be so individual and complex compared to earlier lifeforms further down the tree of life.