Yes, the thought experiment is a fantasy. It requires an AI which takes English goals but interprets them literally. We don’t even know how to get to an AI that takes English goals, and that’s probably FAI-complete.
If you solve the problem of making an AI which wants to interpret what you want it to do correctly, you don’t need to bother telling it what to do: it already wants to do what you want it to do. There should be no need for the system to require English-language inputs, any more than a calculator requires you to shout “add the numbers correctly!”