If you have a complex goal and don’t know the steps required to achieve it, “does what you want” is not enough.
If, however, the AGI “wants what you want”, it can figure out the necessary steps itself.
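To make the distinction concrete, here is a minimal toy sketch in Python (the pasta world, action names, and both agent functions are hypothetical illustrations, not anything from the discussion above): an agent that only *does* what it is told executes one instruction and knows nothing about the goal, while an agent that *wants* what you want can search for the step sequence on its own.

```python
from collections import deque

# Hypothetical toy world: each action maps a state dict to a new state dict.
ACTIONS = {
    "boil_water": lambda s: {**s, "water": "boiled"},
    "cook_pasta": lambda s: {**s, "pasta": "cooked"} if s.get("water") == "boiled" else s,
    "serve_meal": lambda s: {**s, "fed": True} if s.get("pasta") == "cooked" else s,
}

def goal(state):
    """The human's underlying goal: being fed."""
    return state.get("fed", False)

def does_what_you_want(state, instruction):
    """Executes exactly one named instruction; knows nothing about the goal."""
    return ACTIONS[instruction](state)

def wants_what_you_want(state):
    """Shares the goal, so it searches (BFS) for the necessary steps itself."""
    frontier = deque([(state, [])])
    seen = set()
    while frontier:
        s, plan = frontier.popleft()
        if goal(s):
            return plan
        key = tuple(sorted(s.items()))
        if key in seen:
            continue
        seen.add(key)
        for name, act in ACTIONS.items():
            s2 = act(s)
            if s2 != s:  # only explore actions that change the state
                frontier.append((s2, plan + [name]))
    return None

start = {"fed": False}
# The instruction-follower needs you to already know the right step:
print(does_what_you_want(start, "serve_meal"))  # {'fed': False} - no effect yet
# The goal-sharer finds the steps on its own:
print(wants_what_you_want(start))  # ['boil_water', 'cook_pasta', 'serve_meal']
```

In this sketch, telling the instruction-follower “serve_meal” fails because the prerequisite steps haven’t been done, which is exactly the “complex goal whose steps you don’t know” problem; the goal-sharing agent recovers the full plan itself.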
An example I had in my head was something like “Human wants food, I’ll make a bowl of pasta” vs. “I want the human to survive and will feed them whether they want to eat or not, because they want to survive, too”. I am not sure why the latter is needed, if that is what you are saying.