If we’re talking about brain implants and advanced AI, the singularity would occur by the time we reach this level of development. The problem is: what if superweapons occur before superintelligence?
Like, say, in 1945?
I don’t think what I described would require a super-intelligence.
No, but the scenario you’re describing reminds me very much of the post on the definition of existential threat. In particular,
A totalitarian regime takes control of earth. It uses mass surveillance to prevent any rebellion, and there is no chance for escape.
Networking loads of brains together is one of the more eclectic proposals on how to create a super-intelligence.
The simpler proposal of panopticon surveillance plus AI to interpret the data might be doable without AGI, however.