I realize that it’s a map of risks, I’m just saying the possibilities don’t even remotely fall into comparable levels of risk. “Death from nuclear ICBM” is quite imaginable and possible. Not only that, there was a time when it almost seemed imminent and inevitable. And it could easily become that way again. Whereas “death from cold fusion” is essentially of zero meaningful concern.
Maybe it would be useful if you could attach some kind of crude probabilities to your estimates. I can fill a pdf with items like “death from massive leprechaun attack” but it wouldn’t be a very useful guide.
While I do not appreciate the wording “death from cold fusion” when we are speaking about proliferation risks connected with new technologies, I have already added a rough probability estimate to the map and painted the boxes in one of three colors. But instead of probability I used “importance of risks”, which is more clearly connected with what we should do to prevent them.
“Importance (or urgency) of risks is subjectively estimated based on their probability, timing, magnitude of expected effect and the scientific basis for the risk. Importance here means how much attention and effort we should put into controlling the risk.
Green – just keep it in mind, do nothing
Yellow – pay attention, make reasonable efforts to prevent
Red – pay immediate attention to prevent”
The pdf is here: http://immortality-roadmap.com/nukerisk2.pdf
In it, only two risks are red: nuclear war and nuclear-biological war.
The risk of large-scale proliferation connected with new technologies is yellow,
and the risk of Jupiter detonation is green.
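The legend quoted above combines four factors (probability, timing, magnitude, scientific basis) into one of three action tiers. A minimal sketch of how such a crude scoring rule might look is below; the attribute names, weighting formula, and thresholds are all illustrative assumptions on my part, not taken from the actual map:

```python
# Hypothetical sketch of a three-color "importance of risks" scale.
# The formula and thresholds are assumptions for illustration only.

def importance_color(probability: float, magnitude: float,
                     timing_years: float, scientific_basis: float) -> str:
    """Combine the four legend factors into a crude score, then
    bucket it into green/yellow/red."""
    # Nearer-term risks weigh more; scientific_basis in [0, 1]
    # discounts speculative scenarios (e.g. "Jupiter detonation").
    urgency = 1.0 / max(timing_years, 1.0)
    score = probability * magnitude * urgency * scientific_basis
    if score >= 0.01:
        return "red"     # pay immediate attention to prevent
    if score >= 0.0001:
        return "yellow"  # pay attention, make reasonable efforts
    return "green"       # just keep it in mind, do nothing

# A well-grounded, near-term, high-probability risk lands in red:
print(importance_color(0.2, 1.0, 10, 0.9))   # red
# A speculative, tiny-probability one lands in green:
print(importance_color(1e-6, 1.0, 50, 0.1))  # green
```

The point of such a rule is only that the colors follow from explicit inputs, which is what the request for "crude probabilities" is asking for.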