If MAD isn’t a concern in using a given weapon, it doesn’t sound like much of an existential threat.
I don't understand the logic of this sentence. If I build an Earth-destroying bomb in my basement, MAD doesn't apply, yet the bomb is still an existential threat. The same reasoning applies to nanotech, biotech, and AI.