Came here to say basically the same thing Zach Stein-Perlman said in his comment, “This isn’t taking AGI seriously.” The definition of AGI, in my mind, is that it will be an actor at least on par with humans. And it is highly likely to be radically superhuman in at least a few important ways (e.g. superhuman speed, rapid replication, knowledge-merging).
I think it’s highly unlikely that there will be anything but a brief period (<5 years) in which superhuman AGI exists but humanity remains fully in control. We will quickly move to a regime where there are independent digital entities, and then to one where those independent digital entities have more power than all of humanity and its controlled AGIs put together. Either these new digital superbeings love us and we flourish (including potentially being uplifted into their modality via BCI, uploading, and intelligence enhancement), or we perish.