And the robot replies, “Does taking this action of precommitment cause the biggest increase in the utility of currently existing people?”
I’d say yes. It gives an additional 1 utility to currently existing people, since it ensures that, later on, the robot will make a choice they like.
Are you only counting how much they value the world as it currently is? For example, if someone wants to be buried when they die, the robot wouldn’t arrange it, because by the time it happens they won’t be in a state to appreciate it?
Ooooh. Okay, I see what you mean now—for some reason I’d interpreted you as saying almost the opposite.
Yup, I was wrong.
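To make the distinction in this exchange concrete, here is a minimal Python sketch of the two readings of “utility of currently existing people.” The `Person` class, its field names, and the 1-utility figure are illustrative assumptions, not anything specified in the dialogue.

```python
from dataclasses import dataclass

@dataclass
class Person:
    alive_at_outcome: bool   # will they exist when the outcome actually occurs?
    values_outcome: float    # how much they value that outcome, judged right now

def preference_reading(people):
    """Count how much currently existing people value the outcome,
    whether or not they will be around when it happens."""
    return sum(p.values_outcome for p in people)

def experience_reading(people):
    """Only count utility a person could actually appreciate at the time
    the outcome occurs (so a posthumous burial counts for nothing)."""
    return sum(p.values_outcome for p in people if p.alive_at_outcome)

# The burial example: one currently existing person who wants to be buried
# after they die (worth 1 utility to them now), but who won't be alive
# to appreciate it when it happens.
requester = [Person(alive_at_outcome=False, values_outcome=1.0)]
print(preference_reading(requester))   # 1.0 -- the robot arranges the burial
print(experience_reading(requester))   # 0.0 -- the robot doesn't bother
```

On the first reading the precommitment (and the burial) pays out because people alive now want it; on the second, it only pays out if those people will still be around to appreciate the result.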