You estimate the government might press ahead even with a 9% probability of extinction. If every competing government takes on a separate risk of this magnitude, a risk of its own project failing that is genuinely independent of its competitors', as with the risk of releasing an AI that turns out to be Unfriendly, then with 10 such projects we have a 90% total probability of the extinction of all life.
Um, hypothetically, once the first superintelligent AI is released (Friendly or not), it isn't going to give the next group a go.
Only the odds on the first one to be released matter, so the per-project risks can't add up to a 90% risk. Even if all ten projects somehow did run to completion independently, ten independent 9% risks would combine to 1 − 0.91^10 ≈ 61%, not 90%.
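
A quick back-of-the-envelope sketch of the two models (the 9% per-project figure and the ten-project count come from the claim above; everything else is purely illustrative):

```python
# Back-of-the-envelope comparison of the two risk models.

p = 0.09  # per-project probability of catastrophe (from the quoted claim)
n = 10    # number of competing projects (from the quoted claim)

# Naive model: all n projects run to completion, each risking
# catastrophe independently. Combined risk is 1 minus the probability
# that every single project avoids catastrophe.
independent_risk = 1 - (1 - p) ** n
print(f"All projects run independently: {independent_risk:.0%}")  # ~61%, not 90%

# "First release wins" model: whichever project releases first decides
# the outcome (Friendly or not), so only that one project's risk matters.
first_release_risk = p
print(f"Only the first release matters: {first_release_risk:.0%}")  # 9%
```

So even granting full independence, simple addition overstates the risk, and once you account for the first release ending the race, only one project's odds are in play at all.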
That said, you're right that it would be a good thing for governments to take existential risks seriously, just as it would be for pretty much everyone else, ya?