I won’t address why [AIs that humans create] might[1] have their own alien values (so I won’t address the “turning against us” part of your comment), but on these AIs outcompeting humans[2]:
There is immense demand for creating systems which do ≈anything better than humans, because there is demand for all the economically useful things humans do — if someone were to create such a thing and be able to control it, they’d become obscenely rich (and probably come to control the world[3]).
Also, it’s possible to create systems that do ≈anything better than humans. In fact, it’s probably not that hard — it’ll probably happen at some point in this century by default (absent an AGI ban).
[1] and imo probably will

[2] sorry if this is already obvious to you, but I thought from your comment that there was a chance you hadn’t considered this

[3] if moderately ahead of other developers and not shut down or taken over by others promptly