OMEGA: If you pay me just one penny, I’ll replace your 80% chance of living for 10^(10^10) years with a 79.99992% chance of living 10^(10^(10^10)) years.
HUMAN: That sounds like an awful lot of time. Would you mind writing it out as a decimal number?
OMEGA: Here it is… Of course, don’t expect to finish reading this number in less than 10^9999999990 years.
HUMAN: Never mind… It’s such a mind-boggling amount of time. If I got bored or otherwise distressed and lost my lust for life, would I be allowed to kill myself?
OMEGA: Not really. If I allowed that, then even assuming the probability of killing yourself were only 0.000000001 per 10^10 years, it would be almost certain that you kill yourself before the end of those 10^(10^(10^10)) years.
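Omega’s arithmetic checks out, assuming (as the calculation implicitly does) that the suicide risk is independent across periods. A minimal sketch, working in log space because the number of periods is far too large for floating point:

```python
import math

p = 1e-9        # suicide probability per period (figure from the dialogue)
period = 1e10   # years per period

def log10_survival(total_years):
    """log10 of P(no suicide within total_years), assuming independent periods."""
    n_periods = total_years / period
    return n_periods * math.log10(1 - p)

# Even a mere 10^30 years (nothing next to 10^(10^(10^10))) is already hopeless:
print(log10_survival(1e30))   # about -4.3e10, i.e. P(survive) ~ 10^(-43 billion)
```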
HUMAN: This sounds depressing. So my decision has the potential to confine me to grillions of years of suffering if I lost my lust for life.
OMEGA: OK, I see your point. I’ll also offer you some drugs to make you happy whenever you are in distress, and I promise to modify your brain so that you will never even wish to kill yourself during these few eons.
HUMAN: Sounds great, but I also enjoy your company very much. Can I hope for you to entertain me from time to time with bets like this?
OMEGA: Yes, BUT: you will never be able to enter another bet that carries a nonzero chance of death. Otherwise you would enter so many bets over your lifetime that it would become inevitable that you die long before the end of it.
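Omega’s point about repeated bets can be made concrete. Taking the 99.9999% survival factor from the second offer as the per-bet risk, this sketch counts how quickly the odds compound away:

```python
import math

per_bet_survival = 0.999999   # survival factor per bet, from Omega's second offer

# Number of such bets needed to cut the survival probability in half:
k_half = math.log(0.5) / math.log(per_bet_survival)
print(round(k_half))          # roughly 693,000 bets

# Over an unbounded lifetime the number of tempting bets is unbounded too,
# so the compounded survival probability drifts toward zero.
```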
HUMAN: Hmmm, interesting… I think I could still live with that. In that case, I take your offer.
OMEGA: But wait! There’s another special offer, and you won’t even have to pay a penny for this one—this one is free! That’s right, I’m offering to exponentiate your lifespan again, to 10^(10^(10^10,000,000,000)) years! Now, I’ll have to multiply your probability of survival by 99.9999% again, but really, what’s that compared to the nigh-incomprehensible increase in your expected lifespan?
HUMAN: Interesting. However, that sounds a bit unrealistic to me. How long have we known each other?
OMEGA: Why do you ask?
HUMAN: I had already decided to accept your original offer. Losing those eons, even at a 0.0001% chance, sounds quite risky to me. I just want to check whether I can trust you enough to expect that risk to pay off.
OMEGA: What do you mean?
HUMAN: I need some Bayesian evidence that you are really capable of keeping your promise.
OMEGA: What do you expect me to do?
HUMAN: I am in a generous mood. I think 0.000001% of those 10^(10^(10^10)) years of supporting me, answering my questions, and providing company would be enough to convince me that you mean your offer seriously enough that it is worth the 0.0001% risk of losing the remaining 99.999999% of those eons.
OMEGA: Nice try tricking me. If I granted your wish, you’d get 10^(10^(10^10) − 8) years, which is essentially the whole thing, with a potential prolongation to 10^(10^(10^10)) years.
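Omega’s objection is a fact about exponent arithmetic: dividing by 10^8 subtracts only 8 from an exponent of size 10^(10^10). A scaled-down sketch (the tower is shrunk to 10^100 so the numbers fit in memory):

```python
import math

# Scaled-down illustration: take 0.000001% of a power of ten and the exponent
# drops by exactly 8, which is negligible next to the exponent itself.
T = 10**100                  # stand-in for 10^(10^(10^10))
fraction = T // 10**8        # 0.000001% of T
print(math.log10(fraction))  # 92.0: the exponent barely moved
# For the real offer, log10 of the "evidence period" is 10^(10^10) - 8,
# indistinguishable from the full 10^(10^10) at any practical precision.
```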
HUMAN: Come on, I am just trying to be rational. How could I trust you without enough Bayesian evidence? I’d be risking gazillions of eons for a dubious payoff. That requires an extraordinary amount of evidence.
OMEGA: This stinks. I don’t do anything like that. Either you trust me 100% or you don’t. Your choice.
HUMAN: No problem. Even with 100% certainty in your capabilities, I am not 100% sure about mine. You see, I’ve got only these puny neurons, a mere 10^11 of them, and you want me to make an educated choice about 10^(10^(10^10)) years? We have already found several caveats (the possibility of suicide, unhappiness, Bayesian evidence, etc.) that had to be resolved. How can I be sure that’s all of them? What if there is some other catch that I could not think of, and then I have to live with my decision for 10^(10^(10^10)) years? Surely the probability of overlooking something that would spoil those years is high enough not to risk taking the offer. Anyway, what is the probability that you come up with an even more convincing offer at some point during the next 10^(10^10) years?
Drugs? You don’t need drugs. You just need FUN! Hey, there’s a reason why I wrote that, you know.
“Drug” was just a catchy phrase for Omega’s guarantee to cure you of any psychological issues that could cause you prolonged distress.
You could insist that it is entirely impossible that you’d need it.
Wouldn’t it be a bit overconfident to make any statement about what is possible for some insanely complex and alien future self of yours, over a period of time measured by a number (in years) that takes billions to the power of billions of your current lifetimes just to read?
Assuming independence across periods, which is unreasonable.
Even with very slowly growing estimates, p(suicide in t years) = log(log(…(log(log t)))) would give the human enough incentive to refuse the offer at some point (after accepting a few), unless there is an extra guarantee against dying earlier by suicide.
Therefore, at that point, Omega will have to make this offer if he wants to convince the human.
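This can be sketched numerically. Everything here is a made-up illustration (the coefficient a = 0.01 in particular), and lifespans are passed in as log10(log10 t) so the tower-sized numbers stay representable:

```python
import math

# Hypothetical model: p(suicide within t years) = min(1, a*log10(log10(log10 t))).
# The argument is log10(log10(t)) so that tower-sized lifespans fit in a float.
def p_suicide(loglog_t, a=0.01):
    return min(1.0, a * math.log10(loglog_t))

print(p_suicide(10))     # t = 10^(10^10) years: p = 0.01
print(p_suicide(1e10))   # t = 10^(10^(10^10)) years: p = 0.1
# A few more exponentiations of the lifespan push p to the cap of 1, so at
# some point the human refuses unless Omega guarantees no suicide.
```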
The limit as t->infinity of p(suicide in t years) is probably considerably less than 1; I think that averts your concern.
This is highly subjective... and not the point anyway. The point is that there are too many unclear aspects, and one can come up with a lot of questions that were not specified in the OP. For example, it is not even clear whether you die with 100% certainty once your agreed-upon lifetime expires, or whether there is still a chance that some other offer comes along. Your estimated probability of suicide, Omega’s guarantee against it, guarantees on the quality of life, Bayesian evidence about Omega: these are all factors that could influence the decision.
And once one realizes that these were all there, hidden, doubts arise as to whether a human mind should attempt to make such high-stakes decisions at all, based on so little evidence, for so far ahead in time.
Actually, by the terms of the first bet, you are PREVENTED from taking the second. You’re not allowed to commit suicide or to take any action with a nonzero chance of death.