I don’t think this model of AI impact is very useful. “Disempower” is too poorly defined to reason about, and you’re overfocused on AI acting independently to do things that (many) humans wish it wouldn’t. Much more likely is AI empowering very small groups of humans to exploit the majority of humans even more thoroughly. This doesn’t require AGI, or even independence, or even AI: gains in economic and industrial/tool efficiency alone could obsolete masses of humans.