VNM utility is a gross oversimplification of the types of complex value judgements a brain uses.
A better approximation of human value would be a vector utility function, and then all the VNM axioms beyond 1 break down.
The closest humans ever get to a scalar utility function is with money, but money never represents our actual utility.
$1 million isn’t 1,000 times better than $1,000, for example. A particular X-dollar valuation is a gross approximation of a complex vector utility representing the set of things you could do with that money.
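To make the vector-utility point concrete, here is a minimal sketch, assuming Pareto dominance as the comparison rule (an illustrative choice, not anything specified above). With a scalar utility any two outcomes are comparable; with vector utilities, dominance yields only a partial order, so the completeness the VNM theorem needs breaks down.

```python
# Sketch: comparing vector-valued utilities by Pareto dominance.
# Dominance gives only a partial order: some pairs of outcomes
# are simply incomparable, unlike with a scalar utility.

def dominates(u, v):
    """True if u is at least as good as v in every component
    and strictly better in at least one."""
    return all(a >= b for a, b in zip(u, v)) and any(a > b for a, b in zip(u, v))

# Illustrative two-component utilities, e.g. (wealth, leisure).
a = (3, 1)
b = (1, 3)
c = (2, 2)

# No pair among a, b, c dominates the other: all three are
# mutually incomparable, so no complete ranking exists.
for u, v in [(a, b), (b, c), (a, c)]:
    assert not dominates(u, v) and not dominates(v, u)
print("all three outcomes are pairwise incomparable")
```

The two components and their values are made up purely for illustration.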
I wasn’t saying that money = utility; I was just claiming that utility exists. Are you saying axiom 2 is inappropriate to use when modeling human preferences? Can you give any example of three things that preference is not transitive over? Obviously you could trick irrational brains into violating transitivity, but on a fully rational analysis all our preferences are transitive. Given the choice between irrational-brain value judgments and brain judgments plus rational reflection, I’d choose the second every time (except maybe some pathological cases that I don’t care enough about to bother checking whether they’re actually possible).
1. A(paper), B(rock), C(scissors). A>B>C>A
2. Transitivity may not hold in situations where all the choices are not available at once. For example, given activity choices A(fishing), B(dancing), C(reading), I may pick A>B>C when made aware of all choices, but in isolation may pick A>B, B>C, C>A.
This becomes more pronounced as you factor in the time it takes to evaluate choices (such as complex potential futures).
This doesn’t seem to have anything to do with intransitive preferences. Paper beats rock if they are played against each other, but you don’t prefer paper over rock. Well, you do if your opponent plays scissors, or if you have some foreknowledge that they’re especially likely to play scissors, but in the absence of that...
Two isolated tribes in the Amazon meet every summer solstice at the Ziggurat of the Nameless Raven-God. In the previous year the elders of each respective tribe have debated frantically amongst themselves which sacrifice to bring the nameless one to win his favor for the coming year.
It is said that the Ashen-Feathered Night prefers its own kind as a sacrifice, at least over an offering of obsidian, for the latter lacks the inky blackness of the feathered kind.
In its aspect as the Demon Gate of Truth, however, the nameless one would rather feast on the rotten, amputated limbs of those maimed over the previous year; ravens, who of course speak only lies, pale in comparison to such a gift.
Finally, the nameless one is also the Endless Macuahuitl, which requires precious obsidian blades to lengthen its diabolically long grinding edge. Bound by ancient law to only sever living flesh, it would recoil in anger upon an offering of dead flesh.
Three aspects then, for the same terrible Nameless Raven-God.
So, late into Midsummer’s Eve the elders debate which aspect the other tribe will attempt to please. Raven, flesh, or obsidian—only a gift more favorable to the nameless one will convince it to withhold its baleful curses from one tribe.
I would advise them to write down a probability distribution and calculate the utilities of pleasing vs displeasing the Raven-God; that transitivity holds should then be obvious.
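A minimal sketch of that calculation, with made-up aspect probabilities and ±1 utilities (the numbers are illustrative, not anything from the story): each offering gets an expected utility over the aspects, and sorting by that scalar produces a total, hence transitive, ordering.

```python
# Sketch of the advice: assign a probability to each aspect the god
# might take, a utility to each offering under each aspect (+1 if the
# aspect favors it, -1 if it recoils, 0 otherwise), and rank offerings
# by expected utility. All numbers here are illustrative.

aspect_probs = {"night": 0.5, "gate": 0.3, "macuahuitl": 0.2}

utility = {
    "raven":    {"night": 1,  "gate": -1, "macuahuitl": 0},
    "flesh":    {"night": 0,  "gate": 1,  "macuahuitl": -1},
    "obsidian": {"night": -1, "gate": 0,  "macuahuitl": 1},
}

def expected_utility(offering):
    return sum(p * utility[offering][a] for a, p in aspect_probs.items())

# Sorting by a scalar expected utility yields a total order over the
# offerings, so transitivity holds by construction.
ranking = sorted(utility, key=expected_utility, reverse=True)
print(ranking)
```

Even though each aspect’s pairwise tastes form a cycle (raven beats obsidian, flesh beats raven, obsidian beats flesh), the expected utilities themselves are plain numbers and cannot be cyclic.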
The god’s preferences are intransitive. I don’t know how to make this clearer.
The god’s preferences depend on its state. It prefers Ashen-Feathered Night+raven over Ashen-Feathered Night+obsidian, but does not in general prefer raven to obsidian. A preference must take into account all relevant factors.
Gods are not humans, nor rational. The only entities making actual choices are the tribes.
Eh, forget it. I’m turning in my Bardic Conspiracy membership.
Eh. I liked the story, and the imagery, I just didn’t find it at all a good argument.
It’s kind of silly, but I’m thinking of the subset games where you only ever get 2 options.
If I and my opponent only have A(paper), B(rock) to choose from, then always A > B. Likewise B>C, C>A.
I’m not sure how this maps to larger practical situations, but one may be able to make some analogy out of it.
Actually, rock-paper-scissors dynamics come up in strategy games frequently.
The case in strategy games is not intransitive. Given any distribution of the opponent’s play, there is an optimal response to it. For example, if my opponent played 40% rock, 30% paper, and 30% scissors, I would prefer paper, then rock, then scissors. If your opponent plays all three equally, you are simply indifferent among them; that is an absence of preference, not a circular preference. Randomization is used to prevent the opponent from gaining information about you. If you could use a pseudorandom method to exploit failures in their cognition and win in the long run, that would be the preferable strategy.
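That ordering can be checked directly. A minimal sketch, assuming the standard payoffs of win = +1, loss = −1, tie = 0:

```python
# Expected payoff of each throw against a fixed opponent mixture
# (40% rock, 30% paper, 30% scissors), with win = +1, lose = -1, tie = 0.
# The resulting numbers rank paper > rock > scissors: an ordinary
# transitive preference, not a cycle.

opponent = {"rock": 0.4, "paper": 0.3, "scissors": 0.3}
beats = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def expected_payoff(move):
    payoff = 0.0
    for their_move, p in opponent.items():
        if beats[move] == their_move:
            payoff += p          # we win
        elif beats[their_move] == move:
            payoff -= p          # we lose
        # ties contribute 0
    return payoff

for move in ("paper", "rock", "scissors"):
    print(move, expected_payoff(move))
```

Against the uniform mixture (1/3 each), all three expected payoffs come out equal to zero: indifference, not a preference cycle.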
In none of these cases would I decide to throw rock, then realize paper is better and change my choice, then realize scissors is better, and so on ad infinitum. Paper is not necessarily a better choice than rock; it would just beat rock in a game. Equating these two concepts is a level confusion.
Would you really call that rational? If my brain behaved this way, I would attempt to correct it.
Either way, if axiom 2 were interpreted as referring to choices made when all options were known, for example if you knew you could fish, dance, or read and were asked to rank among all of them, the VNM theorem would still work. In this case, you would never say C is better than A because you would always be aware of B.
If you and your opponent only have paper and rock to choose from, this is correct. But if that is the case, then you are not considering two options within the existing game, you are considering a different game entirely. To equate your preference for paper over rock in a game of Rock-Paper, with a preference for paper over rock in a game of Rock-Paper-Scissors, is a confusion. In that case, the scenario would read, “My opponent can throw Rock, Paper, or Scissors; if we assume I don’t want to go Scissors (but my opponent does not know this), what should I do?” Within the given game, there are no intransitive preferences.