There are a lot of assumptions here, some of which I think don’t hold. But the concept, an institutional variation of the Bayesian Conspiracy, is interesting and new.
One such assumption is that a powerful AI will require a mature, difficult-to-bootstrap industrial process, instead of, say, something that can be created by anyone with access to a development environment (this is where I think the analogy with nuclear weapons starts to break).
Also, the existence of a barrier controlling future technology will create powerful incentives for competing agencies and hackers to circumvent it.
A singleton could also shield us from Basilisks, although it would also shield us from the opposite kind of acausal trade (which I’ve called Phoenixes).