It seems to me that believing ASI can kill you and believing ASI can save you are both pretty directly downstream of believing in ASI at all. Since the premise is that everyone believes pretty strongly in the possibility of doom, they'd mostly get there by believing in ASI, and so would mostly believe in the upside potential too.
There are several intermediate steps in the argument from ASI to doom.
Yes. But because we’re discussing a scenario in which the world is ready to slow down or shut down AGI research, I’m assuming those steps have been crossed.
The biggest step IMO, "alignment is hard", doesn't intervene between taking ASI seriously and thinking it could prevent you from dying of natural causes.