I don’t think that’s a prediction of any sensible theory of evolution. It seems more like existential paranoia.
IMO, existential paranoia makes sense in the same way it makes sense for an engineer to be paranoid about a bridge, plane, or nuclear power plant they are building: Lives are at stake and there’s no “redo” button if you don’t get it right the first time.
Wait a minute, that is a non-epistemic justification for a propositional claim. You normatively should build huge safety margins into your bridges, but it’s still an error to overestimate the risk of a bridge collapse, even if that belief motivates the engineer to work harder.
I agree. If I had paid more attention to the discussion, I might have realized that in this case “paranoia” was strictly in reference to probability estimates and not in reference to emotions or resource allocation. Sorry everybody.
I don’t think that makes it OK to systematically paint an inaccurate picture of the risk to help drum up support for your cause.
I agree.
It’s not the prediction of a sensible theory of evolution. It has nothing whatsoever to do with evolution, and I struggle to figure out where the idea that it does comes from. The idea, correct or incorrect, is the result of extrapolating several independent trends (in particular, nanotech and AI). We’ve managed not to kill ourselves so far, but that’s partly a matter of luck. Even if the only way we could kill ourselves was with nuclear weapons, there’s still a nontrivial chance that we would, especially with India and Pakistan in on the game now. And there are new threats as well.
Edit: I don’t necessarily think that existential disaster is more probable than not, but I definitely think it shouldn’t be dismissed out of hand. And since people are downvoting this, I’m wondering where they disagree with that.
Evolutionary progress has an element of luck (sure, we could be wiped out by a meteorite tomorrow), but negative events so far have been relatively rare.
IMO, you’re reading your trend lines wrong—failing to properly account for the decrease in warfare and the rise of surveillance technology.
We are not talking about a “nontrivial chance” here. We are talking about “existential disaster seems likely”. I read that as meaning the chances seem greater than 50%.
I find it very, very hard to estimate the actual chances of any particular existential disaster. I would not put that chance below 20% this century.