If someone thinks ASI will likely go catastrophically poorly if we develop it in something like current race dynamics, they are more likely to work on Plan 1.
If someone thinks we are likely to make ASI go well if we just put in a little safety effort, or thinks it’s at least easier than getting strong international slowdown, they are more likely to work on Plan 2.
The choice should depend on neglectedness more than on credence. If you think ASI will likely go catastrophically poorly, but nobody is working on the "put in a little safety effort" approach in case that effort would actually be enough, then that approach is worth doing more of. Credence determines the shape of a good allocation of resources, but all major possibilities should be prepared for to some extent.