Then unless we are *extremely confident* that there is no imminent doom, we should still be scaring people. Scaring people is bad and we want to avoid it, but we want to avoid extinction a whole lot more. Given that, I think it makes the most sense to go with something like a 99th percentile date.
This sort of reasoning leads to Pascal’s Mugging and lots of weirdness like concern for the welfare of insects or electrons.
We should not be acting on the basis of extremely unlikely events because the events lead to a very large change in utility.
This just changes the quantile you should use. I already tried to adjust for this in the ballpark number I gave — I in fact think existential doom is much more than 99 times worse than scaring people. Even if you disagree and say it’s not worth trying to avoid an outcome because it has only a 1% probability, you’d probably agree it’s worth it at 50%, and it seems to me the median timeline is still way past 2028.
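The quantile argument above can be made concrete with a minimal sketch. It assumes a simple expected-cost model (my framing, not necessarily the commenter's): scaring people costs 1 unit and is only wasted if doom doesn't occur, while doom costs `harm_ratio` units and is averted by warning. Under those assumptions, warning beats silence exactly when P(doom) exceeds 1 / (harm_ratio + 1):

```python
def warning_threshold(harm_ratio: float) -> float:
    """Probability of doom above which warning has lower expected cost.

    Assumed model: not warning costs p * harm_ratio in expectation;
    warning costs (1 - p) * 1 (scaring people, wasted only if no doom).
    Warning wins when p * harm_ratio > 1 - p, i.e. p > 1 / (harm_ratio + 1).
    """
    return 1.0 / (harm_ratio + 1.0)

# Doom 99x worse than scaring people -> warn above 1% probability,
# which corresponds to picking the 99th-percentile doom date.
print(warning_threshold(99))  # 0.01
# The 50% case from the reply: even a 1:1 ratio justifies warning
# whenever doom is more likely than not, i.e. at the median date.
print(warning_threshold(1))   # 0.5
```

So "doom is much more than 99 times worse" pushes the threshold well below 1%, and the corresponding quantile well past the 99th percentile.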