I would say that control of an AI by continuing trade is ‘necessary’ if we expect that our desires will change over time, and that we will want to nudge the AI (or build a new AI) to satisfy those unanticipated desires.
Well, that surely isn’t right. Asimov knew that! He proposed making the machines want to do what we want them to—by making them follow our instructions.