For example, suppose my utility function is U(Universe) = #paperclips, which is unbounded in a big universe. Then you’re going to normalise my utilities so that U(AI becomes clippy) = 1 and U(an individual paperclip) = 0.
Yep.
So most likely a certain proportion of the universe will become paperclips.
Yep.
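The normalisation being described can be sketched as an affine rescaling: map a baseline outcome to 0 and the agent's best achievable outcome to 1. This is a minimal sketch with hypothetical names and numbers (the `10**50` paperclip count stands in for "AI becomes clippy"; nothing here comes from a real implementation):

```python
def normalise(u, baseline, best):
    """Affinely rescale utility u so that u(baseline) -> 0 and u(best) -> 1."""
    lo, hi = u(baseline), u(best)
    return lambda outcome: (u(outcome) - lo) / (hi - lo)

# Unbounded paperclip utility: one util per paperclip.
u_clips = lambda n_clips: n_clips

# Normalise against "no paperclips" (baseline) and a hypothetical
# full-clippy outcome of 10**50 paperclips (best).
v = normalise(u_clips, baseline=0, best=10**50)

v(0)        # 0.0 -- baseline
v(10**50)   # 1.0 -- the clippy outcome
v(1)        # ~1e-50 -- an individual paperclip is effectively worth 0
```

Under this rescaling any individual paperclip contributes a negligible normalised utility, which is why the exchange treats U(individual paperclips) as 0 even though the raw utility function is unbounded.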