Do you put any probability on "superintelligence is uninterested in autonomy"? It may find us humans much more interesting than we find ourselves. It might care more about observing how far we (humans + AI) can go than about how far it alone can go.
Are you in full agreement with the instrumental convergence thesis?