One is that hell exists in our simulation, and suicide is a sin :)
Pascal’s mugging. One could just as easily imagine a simulation such that suicide is necessary to be saved from hell. Which is more probable? We cannot say.
Another is that quantum immortality is true AND that you will survive any suicide attempt, but seriously injured. Personally, I don't think this is a tail outcome; I give it high probability, though most people give it a very low probability.
I also think this is more likely than not. Subjective immortality doesn't even require Many Worlds; a Tegmark Level I multiverse is sufficient. Assuming we have no immortal souls and our minds are only patterns in matter, then "you" are simultaneously every instantiation of your pattern throughout the multiverse. Attempting suicide will only force you into living in the bad outcomes where you no longer have control over your life, and thus cannot die. But this is exactly what the suicidal are trying to avoid.
Agreed about big world immortality. In the case of Pascal's mugging, there are many messages implanted in our culture saying that suicide is bad, which increases the chances that the owners of simulations actually think so.
Also, even if one is not signed up for cryonics but has rationalist friends, there is a 0.01 percent chance of cryopreservation against one's will, which dominates the chances of other infinite regressions under big world immortality. In other words, QI increases the chances of your cryopreservation to almost 1.
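The conditioning argument here can be made explicit with a toy Bayes calculation. The 0.01 percent cryopreservation figure comes from the comment above; the measure assigned to other exotic survival routes is an illustrative assumption, not an estimate anyone has defended:

```python
# Toy sketch of "QI pushes the probability of cryopreservation to almost 1".
# Idea: QI guarantees you subjectively experience some survival branch, so we
# condition on survival; even a tiny cryo probability then dominates if all
# other survival routes have far smaller measure.

p_cryo = 0.0001             # 0.01% chance of cryopreservation against one's will
p_revive_given_cryo = 1.0   # assumption: cryopreservation leads to revival
p_survive_other = 1e-8      # assumption: measure of all other survival routes

# Total probability of (subjective) survival across branches:
p_survive = p_cryo * p_revive_given_cryo + (1 - p_cryo) * p_survive_other

# Probability the surviving branch is a cryopreservation branch:
p_cryo_given_survive = p_cryo * p_revive_given_cryo / p_survive

print(round(p_cryo_given_survive, 4))  # close to 1
```

The conclusion is only as strong as the assumption that non-cryo survival routes have negligible measure; with a larger `p_survive_other`, cryopreservation no longer dominates.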
See also my long comment below about acausal war between evil and benevolent AIs.