Much of the world would likely support total drone surveillance of certain countries. Also, in fifty years we could probably put recording devices in people's brains that tell us everything they say and hear, and combine this with AI to immediately identify any terrorist threats.
If we're talking about brain implants and advanced AI, the singularity would occur by the time we reach this level of development. The problem is: what if superweapons arrive before superintelligence?
Like, say, in 1945?
I don’t think what I described would require a super-intelligence.
No, but the scenario you’re describing reminds me very much of the post on the definition of existential threat. In particular,
Networking loads of brains together is one of the more eclectic proposals on how to create a super-intelligence.
The simpler proposal of panopticon surveillance plus AI to interpret the data might, however, be doable without AGI.