Really x-risk mitigation is for dessert, though. There are so many failures of our society that are lower-hanging in terms of how rational you have to be to see them.
For example, the spending of $1,000,000,000,000 ($10^12) on the Iraq and Afghan wars. Just think about the good that could have been done with that money if it had gone into science and medical research. Imagine putting 1% of that money towards life-extension research—that’s $10,000,000,000. Then cry.
That is a lot of zeros. I had to count them twice on http://costofwar.com/ to make sure I got it right.
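To make the zero-counting concrete, here is a quick sanity check of those figures in Python, taking the costofwar.com total of $10^12 as given:

```python
# Sanity check of the figures above, taking the costofwar.com
# total of $10^12 as given.
war_spending = 10 ** 12               # $1,000,000,000,000
life_extension = war_spending // 100  # 1% of the total

print(f"${war_spending:,}")    # $1,000,000,000,000 (twelve zeros)
print(f"${life_extension:,}")  # $10,000,000,000 (ten billion)
```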
I’d say it is more “so you get to dessert”! It’s there as a backup behind “create an FAI and thereby cure death.” (So I’m being a bit reckless and blurring my categories when I put x-risk at 50%.)
I totally agree that FAI is the single most important problem, but it is hard to see that. As far as I know, Bostrom and Eliezer first noticed it as a problem about 10-13 years ago; before that, not one person in the world had seen it as a problem. If our world had at least solved the easier-to-see problems, one would have some confidence that FAI would at least be considered.
For the deleted context of the parent, please refer to the grand-aunt comment. (Courtesy of a bizarre bug somewhere by which every comment and PM reply of mine was being posted twice and, evidently, a fast response by Roko.)
ETA: And I agree with parent.
Or, well, not spending half of the GDP on existential risk mitigation...
That $10b on life extension sounds about right!
Sure. I’d count that under “irrational social epistemology”.