I broadly agree with the points outlined above. It's for those reasons that I don't have a p(doom), at least if we're only talking about AI. What one person considers "doom" may be very different from what another person does. Furthermore, I've seen people report a p(doom) of, say, 45%, which seems to sit in a rough ballpark between their p(extinction) of 20% and their p(takeover) of 50%. A casual observer might assume they see extinction as the only bad outcome, and that it has a probability of 45%.
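As a rough illustration of why the single number is ambiguous (this is a hedged sketch with made-up weights and overlap assumptions, not anyone's actual forecasting method):

```python
# Sketch: how one p(doom) figure can hide very different underlying beliefs.
# All numbers and weighting schemes below are illustrative assumptions.

p_extinction = 0.20
p_takeover = 0.50

# If "doom" means "extinction OR takeover", the result depends on how much
# the two events overlap -- which a single p(doom) never reports.
p_doom_full_overlap = max(p_extinction, p_takeover)       # extinction within takeover: 0.50
p_doom_no_overlap = min(1.0, p_extinction + p_takeover)   # disjoint events: 0.70

# If instead only some takeover scenarios count as doom, a weighted scheme
# can land near 45%: e.g. disjoint events where half of non-extinction
# takeovers are counted as doom.
w = 0.5
p_doom_weighted = p_extinction + w * p_takeover           # 0.20 + 0.25 = 0.45

print(f"doom as union: between {p_doom_full_overlap:.0%} and {p_doom_no_overlap:.0%}")
print(f"doom with partial weighting: {p_doom_weighted:.0%}")
```

So 45% is consistent with several mutually incompatible readings, which is exactly why the headline number communicates so little on its own.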