There are probably several things on which I would broadly agree with you; however, your post would be much better without the condescending tone. And perhaps without all the non sequiturs:
If the rest of the world is underconfident about these ideas, then these investments would surely have an enormous expected rate of return.
Why? If people don’t believe that cryonics will work, you can’t sell it to them for a lot of money even if they are wrong. (Disclaimer: I haven’t signed up for cryonics.)
How many people responding to this survey have actually made significant personal preparations for survival, like a fallout shelter with food and so on which would actually be useful under most of the different scenarios listed?
If you believed there was going to be a nuclear war in 90 years, would you start buying food and preparing a shelter right now?
The risks listed in the survey results were pandemic (bioengineered), environmental collapse, unfriendly AI, nuclear war, economic/political collapse, pandemic (natural), nanotech, and asteroid. A few of these could be short-term catastrophes with a critical first few weeks that could be survived in a shelter, but not necessarily. If we are speaking about a disaster wiping out 90% of the global population or more, it’s fair to assume that a lot of people are thinking of an event that renders Earth unlivable, shelter or no shelter.
People can prefer death to living in a post-apocalypse world. (Or prefer “normal” pre-apocalyptic life and then death to life spent in preparation for the apocalypse and survival.)
The question was “which disaster do you think is most likely...”. Therefore, if 23% answer bioengineered pandemics, it doesn’t imply that those 23% actually consider bioengineered pandemics probable. It can simply mean that they find them less improbable than the rest of the list.
That no more than 5% of LW readers are preparing a shelter (likely a correct guess) is an argument for what, exactly? It could be evidence that general LW opinion is actually closer to yours than you seem to believe, or it could be evidence that people are procrastinating, but it certainly doesn’t imply a “grand level of overconfidence in the probabilities of any of these [catastrophes] occurring”.
(Disclaimer: I don’t especially fear future global catastrophes, and moreover I don’t think that we can predict them significantly better than by random guessing.)
The questions on dust specks vs torture and Newcomb’s Problem are so unlikely to ever be relevant in reality that I view discussion about them as worthless.
Relevant to what? Those discussions seem intended as illustrations of theoretical problems with common utilitarian and decision-theoretic intuitions. Learning that one’s intuitions have a bounded domain and don’t work well in extreme, unrealistic scenarios is perhaps not a life-changing achievement, but it is at least interesting. Perhaps not interesting to you, but “not interesting to you” and “worthless” are different things. (Disclaimer: I don’t think that having the correct answer to Newcomb’s Problem or dust specks is going to be practically important in and of itself.)