Yes, that’s what I meant by “takeover.” It’s distinct from ceding control voluntarily.
I do not see humanity ever fully ceding control, as distinct from accepting a lot of help and advice from AIs. Why cede control if you can get all of the upsides without losing the ability to change your mind?
Of course, if you’re accepting all of the advice, you have temporarily ceded control.
But I’m primarily concerned with humanity accidentally losing its ability to choose through the creation of AGI that’s not aligned with the majority of humanity’s interests or desires—the classic alignment question.
What if humanity mistakenly thinks that ceding control voluntarily is temporary, when actually it is permanent because it makes the systems of power less and less adapted to human means of interaction?