It’s not even formally correct. An autonomous AI does not need to create its own terminal goals*, and the will we give it is perfectly adequate to screw us over.
if it can’t create instrumental goals it’s not strong enough to worry about
Probably we disagree about what intelligence is. If intelligence is the ability to pursue goals in the presence of obstacles, the question becomes trivial. If intelligence is the ability to effectively find solutions in a given complex search space, then little follows. It depends on how the AI is decomposed into action and planning components, and where the feedback cycles reside.