My personal strategy has been to not think about it very hard.
I am sufficiently fortunate that I can put a normal amount of funds into retirement, and I have continued to do so on the off chance that my colleagues and I succeed at preventing the emergence of AGI/ASI and the world remains mostly normal. I also don’t want to frighten my partner with my financial choices, and giving her peace of mind is worth quite a lot to me.
If superintelligence emerges and doesn’t kill everyone or worse, then I don’t have any strong preferences about my role in the new social order, since I expect to be at least as well-off as I am now.