Humans obviously don’t maximise a utility function; yet according to the axioms, we should.
Given a choice between “change people” and “change axioms”, I’d be inclined to change axioms.
If you’re a psychologist and you care about describing people, change the axioms. If you’re a rationalist and you care about getting things done, change yourself.