I feel like every time I write a comment, I have to add a caveat about how I’m not as doomy as MIRI and I somewhat disagree with their predictions. But like, I don’t actually think that matters. If you think there’s a 5% or 20% chance of extinction from ASI, you should be sounding the alarm just as loudly as MIRI is! Or maybe 75% as loudly or something. But not 20% as loudly—how much you should care about raising concern for ASI is not a linear function of your P(doom).