“Rational expected-utility-maximizing agents get to care about whatever the hell they want.”—a good heuristic to bear in mind. There really are an awful lot of orderings on possible worlds, and if value is complex, your utility function* probably isn’t linear.
*usual disclaimers apply about not actually having one.
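One rough way to make the counting point concrete (an illustrative sketch, not part of the original quote; the features x_i and weights c_i below are hypothetical, assuming worlds are described by k real-valued features and "linear" means linear in those features):

```latex
% Illustrative counting sketch (assumptions as stated above).
\[
  \#\{\text{strict orderings of } n \text{ possible worlds}\} \;=\; n!
\]
\[
  U_{\text{lin}}(w) \;=\; \sum_{i=1}^{k} c_i \, x_i(w)
\]
% For fixed features, varying the weights c_i realizes only polynomially many
% (in n) of those n! orderings, so a sufficiently complex ordering generally
% forces a non-linear U.
```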