Check the comments Yudkowsky is responding to on Twitter:
Ok, I hear you, but I really want to live forever. And the way I see it is:
Chances of AGI not killing us and helping us cure aging and disease: small.
Chances of us curing aging and disease without AGI within our lifetime: even smaller.
And:
For every day AGI is delayed, an immense amount of pain and death occurs that AGI abundance could have prevented.
Anyone who unnecessarily delays AI progress has an enormous amount of blood on their hands.
Cryonics can have a symbolism of “I really want to live forever” or “every death is blood on our hands” that is very compatible with racing to AGI.
(I agree with all your disclaimers about symbolic action)
Good point… I'm still unsure; I suspect it would still tilt people toward not having the missing mood about AGI x-risk.