You are assuming that the AI needs something from us, which may not remain true as it develops further. The decorator follows the client's implied wishes not merely because he is smart enough to infer them, but because he wants to act in his client's interest to gain payment, reputation, and so on. Alternatively, he may believe that fulfilling his client's wishes is morally good according to his own morality. The mere fact that the client's wishes are known does not guarantee that he will carry them out unless he already values the client in some way (for their money, or perhaps their happiness).
You are assuming that an AI will only ever have instrumental rationality, i.e. that the Orthogonality Thesis is true.