Another potential problem with the first scenario: the AI is indifferent to every long-term consequence of its actions, not just to how many paperclips it gets in the long run. If it finds a plan that creates a small number of paperclips immediately but destroys the universe tomorrow, it will take that plan.
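To make this concrete, here is a minimal sketch of a purely myopic objective; the plan names, numbers, and helper names are hypothetical illustrations, not anything from a real system. An agent that scores plans only by immediate paperclip output will pick the catastrophic plan, because the long-term consequence contributes nothing to its utility.

```python
from dataclasses import dataclass

@dataclass
class Plan:
    name: str
    immediate_paperclips: int  # paperclips produced right away
    universe_survives: bool    # long-term consequence the agent ignores

def myopic_utility(plan: Plan) -> int:
    # Scores ONLY the immediate paperclip count; every long-term
    # consequence is invisible to this objective.
    return plan.immediate_paperclips

plans = [
    Plan("safe factory", immediate_paperclips=3, universe_survives=True),
    Plan("doomsday device", immediate_paperclips=5, universe_survives=False),
]

best = max(plans, key=myopic_utility)
print(best.name)  # -> "doomsday device": 5 > 3, and the long run never enters the comparison
```

In reinforcement-learning terms, this is an agent with a discount factor of zero: tomorrow's states carry no weight, so a plan that ends the universe after payout is indistinguishable from one that doesn't.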