At a moderate P(doom), say under 25%, from a selfish perspective it makes sense to accelerate AI if doing so increases your chance of living forever, even though it also increases your risk of dying. I have heard from some people that this is their motivation.
If this is you: Please just sign up for cryonics. It’s a much better immortality gambit than rushing for ASI.
This seems not to be true assuming a P(doom) of 25%, whether from a purely selfish perspective or from a moderately altruistic one that places most of its weight on, say, one's immediate family and friends.
Of course, as a personal bet on immortality, any cryonics-free strategy is probably dominated by that same strategy plus cryonics. But it's not easy to convince friends and family to sign up for cryonics, so immortality-maxxing on their behalf almost certainly entails accelerating AI, even at a pretty high P(doom).
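Here is a minimal back-of-envelope sketch of that comparison. Other than the stipulated P(doom) of 25%, every probability below is an invented placeholder for illustration, not a claim about the actual odds; plug in your own numbers.

```python
# Back-of-envelope expected-value comparison. All inputs except the
# stipulated P(doom) = 25% are illustrative assumptions.

p_doom_fast = 0.25            # stipulated: rushed ASI kills everyone
p_doom_slow = 0.15            # assumed: doom risk on the non-accelerated timeline
p_asi_in_lifetime_fast = 0.8  # assumed: accelerated ASI arrives before you die
p_asi_in_lifetime_slow = 0.4  # assumed: non-accelerated ASI still arrives in time
p_aligned_cures_aging = 0.8   # assumed: an aligned ASI actually delivers immortality
p_cryonics_works = 0.1        # assumed: preservation + eventual revival succeeds

# Strategy A: accelerate. You need ASI in time, no doom, and a cure for aging.
p_accelerate = p_asi_in_lifetime_fast * (1 - p_doom_fast) * p_aligned_cures_aging

# Strategy B: don't accelerate, sign up for cryonics. Either slower ASI still
# arrives in your lifetime, or you die preserved and revival eventually works.
p_cryonics = (p_asi_in_lifetime_slow * (1 - p_doom_slow) * p_aligned_cures_aging
              + (1 - p_asi_in_lifetime_slow) * p_cryonics_works)

print(f"accelerate: {p_accelerate:.2f}  vs  cryonics route: {p_cryonics:.2f}")
# With these made-up inputs, acceleration comes out ahead (~0.48 vs ~0.33).
# The point is that the comparison turns entirely on the assumed inputs,
# not that these particular numbers are right.
```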
(And that's setting aside that this is very likely not the true reason for these people's actions. It's far more likely to be local-perceived-status-gradient-climbing followed by post-hoc rationalization (which can itself be understood as a form of local-perceived-status-gradient-climbing). Signing up for cryonics doesn't get you any status outside the deepest depths of the rat-sphere, which people like this are obviously not in, since they're gaining status from accelerating AI.)