1: The methods for constructing a nuclear bomb are by no means “widely known and easy to follow.” Witness the often unsuccessful struggles of many nations for decades to acquire them. Cobalt bombs are even more advanced and difficult to construct than ‘regular’ nuclear weapons.
The scenario RobbBB was presumably envisioning was one in which private individuals have gained the ability to essentially destroy society using, e.g., a super-pandemic. A sufficient number of people have always been able to destroy human society; no new technology would be needed for everybody in the world to simultaneously commit suicide, or, for that matter, for a massive nuclear exchange. Spontaneous collective suicide is not likely. However, action by a small group of individuals (cf. al-Qaeda) is far more likely. Such groups do not at present have the ability to end human society as we know it; RobbBB is envisioning a scenario in which they gain it.
2: In the above comment, I wasn’t talking about existential risks [*]; I am not claiming that a nuclear war would be an existential risk. In any conventional nuclear war of significant size, SF/Berkeley would almost certainly be targeted, killing everybody at MIRI. While I am unsure where the servers storing their website and other data are physically located, it’s overwhelmingly likely that EMP from nuclear detonations would destroy anyone’s ability to access that data. Given the geographic distribution of LWers, it is likely that only a small-ish number would survive a massive nuclear exchange. Presumably at least a few of these individuals would attempt to carry on research and to recopy MIRI’s research and ideas for future generations, assuming that humanity will eventually recover. However, it is extremely unlikely that they would get very far, and very likely that whatever they did write down would be lost or ignored.
[*] Rereading the comment, I actually was talking about existential risks. I have edited it for clarity; I did not intend to, but I adopted RobbBB’s phrasing of something killing us all, whereas I regard nuclear risks as more likely to render MIRI useless by collapsing society. My bad.
Sorry about that. I got confused. s/you/RobbBB/. I understand better now. I still believe that of the five, 3 is probably the most likely. I also 2-believe that I might overestimate that probability. (Sorry if I sound a bit strange. I’m starting to study lojban.)
(I am not RobbBB.)