Would you take the gamble, or would you choose non-existence?
Non-existence is a gamble too. You could lose out on billions of years of happiness! Even without that opportunity cost, I assert that most humans lack the ability to integrate over large timespans, and you're going to get answers that are hard to reconcile with any sane definitions and preferences that don't ALREADY lead most people to suicide.
For me, sign me up for immortality if it’s not pretty certain to be torture.
To say that non-existence is a gamble too is kind of like saying that a person who does not gamble in a casino is gambling too, because they are missing out on a chance to win millions of dollars. To me that is more a matter of definitions, and if one wants to argue for that, sure, let's accept that every single thing in life is a gamble.
Your assertion that humans will not be able to integrate over large timespans might be true given the current human brain, but here we are talking about superintelligence. Even with relatively primitive AIs we are already seeing new medications and cures. A superintelligence that wanted to cause widespread suffering or torture you, and that could build a Dyson sphere around the sun and a thousand other advanced technologies, would be able not just to figure out how to torture you persistently (so that your brain does not adapt to the new state of constant torture) but also to increase your pain levels by 1000x. Not all animals feel the same pain, and there is no reason to think that the current pain experience of humans cannot be increased by a huge amount.
I don't think it is rational to take the gamble when the odds are 1%, much less when they are 20%, 49%, or 70%. Let's go with 1%, because I am willing to give you favourable odds. So the post asks: would you be willing to be in a torture chamber for 1 hour for every 99 hours that you are in a really happy state? We can increase that to 20 hours (20%) or what have you. And here I am talking about real, extreme torture, not a headache. So imagine the worst torture methods that currently exist, and it is not waterboarding; look up the worst torture methods in history. If you are objective, whatever odds you would be willing to accept, say 20% or 1%, would you be willing to be really tortured for that amount of time every single day?
On further reflection, I realize I’m assuming a fair bit of hyperbole in the setup. I just don’t believe there’s more than an infinitesimal chance of actual perpetual torture, and my mind substitutes dust motes in one’s eye.
I don’t think any amount of discussion is likely to get my mind into a state that takes it seriously enough to actually engage on that level, so I’m bowing out. Thanks for the discussion!