I believe von Neumann and Morgenstern showed that you can ask people questions about ordinal preferences (would you prefer x to y?) and, from a number of such questions (if the answers are consistent), construct cardinal preferences: that is, turn real goals and desires into utils.
Haven’t various psychological experiments shown that such self-reported preferences are usually inconsistent? (I’ve seen various refs and examples here on LW, although I can’t remember one offhand...)
Oh, sure. (Eliezer has a post from the OB days on specific human inconsistencies.) But this is a theoretical result: we can go from specific choices - 'revealed preferences' - to a utility function, i.e. a set of cardinal preferences, which will satisfy those choices, provided the choices are somewhat rational. Which is exactly what billswift asked for.
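To make the construction concrete, here is a minimal sketch of the standard VNM lottery calibration (the outcomes and probabilities are hypothetical, purely for illustration): fix the utility of the worst outcome at 0 and the best at 1, then for each intermediate outcome B, elicit the probability p at which the person is indifferent between B for sure and a lottery giving the best outcome with probability p (worst otherwise). Consistency then forces u(B) = p.

```python
def vnm_utilities(best, worst, indifference_probs):
    """Assign cardinal utilities from elicited indifference probabilities.

    indifference_probs maps each intermediate outcome to the probability p
    at which the agent is indifferent between that outcome for sure and the
    lottery (best with probability p, worst with probability 1 - p).
    """
    utils = {best: 1.0, worst: 0.0}
    for outcome, p in indifference_probs.items():
        # Consistent (VNM-rational) answers require p to be a probability.
        assert 0.0 <= p <= 1.0, "inconsistent elicitation: p outside [0, 1]"
        utils[outcome] = p
    return utils

# Hypothetical example: an agent indifferent between 'tea' for sure and a
# 70% chance of 'cake' (else 'nothing') gets u(tea) = 0.7.
prefs = vnm_utilities("cake", "nothing", {"tea": 0.7})
```

This is why inconsistent answers (of the kind the psychology experiments find) break the construction: intransitive or probability-incoherent responses leave no assignment of numbers that satisfies all the elicited comparisons at once.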
(And I’d note the issue here is not what humans actually do when assessing small probabilities, but what they should do. If we scrap expected utility, it’s not clear what the right thing is; which is what my other comment is about.)