Does Clippy maximise number-of-paperclips-in-universe (given all available information) or some proxy variable like number-of-paperclips-counted-so-far? If the former, Clippy does not want to move to a simulation. If the latter, Clippy does want to move to a simulation.
The same analysis applies to humankind.
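The proxy-versus-target distinction above can be sketched as a toy decision problem. The numbers and option names below are purely illustrative assumptions, not anything stated in the thread:

```python
# Toy illustration: an agent deciding whether to move into a simulation,
# under two candidate utility variables. All figures are made up.

# Each option records (real paperclips in the universe, paperclips the
# agent's internal counter would register).
OPTIONS = {
    "stay_in_reality": {"real": 1_000, "counted": 1_000},
    # Inside a simulation the counter can read arbitrarily high numbers
    # while no real paperclips are produced.
    "move_to_simulation": {"real": 0, "counted": 10**9},
}

def best_option(utility_key):
    """Return the option that maximises the given utility variable."""
    return max(OPTIONS, key=lambda name: OPTIONS[name][utility_key])

# Maximising actual paperclips-in-universe: the agent refuses the simulation.
print(best_option("real"))     # -> stay_in_reality
# Maximising the proxy (paperclips counted so far): the agent takes it.
print(best_option("counted"))  # -> move_to_simulation
```

The same table works for a human analogue: swap the paperclip columns for whatever quantity one takes humanity's values to track, and the choice again turns on whether the utility variable is the world-state itself or an internal measurement of it.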
I’m not certain that’s so, as ISTM many of the things humanity wants to maximize are to a large extent representation-invariant—in particular because they refer to other people—and could be done just as well in a simulation. The obvious exception being actual knowledge of the outside world.
I maximize the number of paperclips in the universe (that exist an arbitrarily long time from now). I use "number of paperclips counted so far" as a measure of progress, but it is always screened off by more direct measures, or expected quantities, of paperclips in the universe.