There are a fair number of people who like the movie aesthetic of an underground bunker full of supplies, from which they will eventually emerge into the Fallout video game. In the context of AI risk, this is not remotely realistic. AI emergence scenarios mostly separate into worlds where everyone lives and worlds where everyone dies; near-misses are dystopias and weirdtopias, not most-people-die-but-not-everyone. So I don’t think there’s likely to be anything promising that would fall under the label “prepping”, as conventionally understood.
I meant prepping metaphorically, in the sense of being willing to delve into the specifics of a scenario most other people would dismiss as unwinnable. The reason I posted this is that, though it's obvious the bunker approach isn't really the right one, I'm drawing a blank on what the right approach would even look like.
That being said, I figured one class of scenario might look identical to nuclear or biological war, only facilitated by AI. Are you saying scenarios where many but not all people die due to political/economic/environmental consequences of AI emergence are unlikely enough to disregard?
So let’s talk about dystopias/weirdtopias. Do you see any categories into which these can be grouped? The question then becomes, who will lose the most and who will lose the least under various types of scenarios.
I figured one class of scenario might look identical to nuclear or biological war, only facilitated by AI.
After the nuclear war caused by the AI, there’s likely still an unaligned AI out there. That AI is likely going to kill the survivors of the nuclear war.