How many things could reasonably have a p(doom) > 0.01? Not very many. Therefore your worry about me “neurotically obsessing over tons of things” is unfounded. I promise I won’t :) If my post gave you that impression, then I apologize; I must have misstated my argument.
What is the actual argument that there are ‘not very many’? (Or why do you believe such an argument made somewhere else?)
There are hundreds of asteroids and comets alone that have some probability of hitting the Earth in the next thousand years. How can anyone possibly evaluate a ‘p(doom)’ for each of them, let alone for every other possible catastrophe?
I was reading the UK National Risk Register earlier today and thinking about this. It’s notable to me that the register’s top-level disaster-severity scale caps out quite low, at roughly thousands of casualties or billions in economic loss. It does note, though, that AI is a chronic risk being managed under a new framework (for which I can’t find any precedent).