The Allais Paradox and the Dilemma of Utility vs. Certainty

Related to: The Allais Paradox, Zut Allais, Allais Malaise, and Pascal’s Mugging

You’ve probably heard the Allais Paradox before, where you choose one of the two options from each set:

Set One:

  1. $24000, with certainty.

  2. 97% chance of $27000, 3% chance of nothing.

Set Two:

  1. 34% chance of $24000, 66% chance of nothing.

  2. 33% chance of $27000, 67% chance of nothing.

From set one, which of the two would you choose? Which of the two is the most intuitively appealing? Which of the two would you choose if your only goal is to maximize the amount of dollars you receive? And most importantly, how do you justify your choice?
From set two, which of the two would you choose? Which of the two is the most intuitively appealing? Which of the two would you choose if your only goal is to maximize the amount of dollars you receive? And most importantly, how do you justify your choice?

The reason this is called a “paradox” is that most people choose 1 from set one and choose 2 from set two, even though set two is the same as a 34% chance of being offered the choice from set one.
This is best seen when we shut up and multiply. When we run some naïve expected utility calculations and make the big assumption of a linear utility for money (which the third question above grants us), we get:
U(Set One, Choice 1) = 1.00 * U($24000) = 24000
U(Set One, Choice 2) = 0.97 * U($27000) = 26190
U(Set Two, Choice 1) = 0.34 * U($24000) = 8160
U(Set Two, Choice 2) = 0.33 * U($27000) = 8910
So to the degree that it is rational to want more money (you can always donate anything you don’t want), it seems like we should want Choice 2 from both sets. But why do people only realize this in Set Two?
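The four calculations above, plus the compound-lottery equivalence that makes this a paradox, can be checked in a few lines of Python (a sketch; the payoffs and probabilities are the ones given above, and linear utility is still the big assumption):

```python
# Expected dollar values under the (big) assumption that U($x) = x.
def expected_value(p, payoff):
    """Expected dollars from a 'p chance of payoff, otherwise nothing' gamble."""
    return p * payoff

set_one = [expected_value(1.00, 24000), expected_value(0.97, 27000)]
set_two = [expected_value(0.34, 24000), expected_value(0.33, 27000)]

print([round(v) for v in set_one])  # [24000, 26190]
print([round(v) for v in set_two])  # [8160, 8910]

# Set two is just a 34% chance of being offered set one:
# 0.34 * 1.00 = 0.34 (choice 1) and 0.34 * 0.97 ≈ 0.33 (choice 2).
print(round(0.34 * 0.97, 4))  # 0.3298
```

Choice 2 comes out ahead in both sets, and the last line shows why a preference reversal between the sets is hard to square with expected utility.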
The two competing theories are the “people are silly” theory and the “it is perfectly rational to bet on certainty” theory. What if you go for the 97% chance and miss out on such a large sum? Intuitively, you want to just take your $24000 and run, but according to expected utility, doing so gives up $2190 in expectation.
~

The Problem With “It is Perfectly Rational to Bet on Certainty”

To put some pressure on this theory, all we have to do is introduce set three right here:
Set Three:
  1. $24000, with certainty.

  2. 99.99% chance of $24 million, 0.01% chance of nothing.

From set three, which of the two would you choose? Which of the two is the most intuitively appealing? Which of the two would you choose if your only goal is to maximize the amount of dollars you receive? And most importantly, how do you justify your choice?
I think you’d intuitively say that only a fool would cling to certainty so tightly that he or she wouldn’t take an almost guaranteed $24 million. So why is it okay to give up certainty on some bets and not others, regardless of what expected utility says?
If you had a choice between “$24000 with certainty” and “90% chance of $X”, is there really no value for X that would make you change your mind?
If you had a choice between “$24000 with certainty” and “X% chance of $24001”, what is the smallest value of X that would make you switch?
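Under linear utility, both of these questions have exact break-even answers (a sketch of the arithmetic; actual preferences are under no obligation to be linear):

```python
# "90% chance of $X" ties "$24000 with certainty" when 0.9 * X = 24000.
x = 24000 / 0.9
print(round(x, 2))  # 26666.67

# "X% chance of $24001" ties "$24000 with certainty" when (X/100) * 24001 = 24000.
p = 24000 / 24001
print(round(100 * p, 4))  # 99.9958
```

So under that assumption, any X above roughly $26,667, or any chance above roughly 99.9958%, makes the gamble the better expected-dollar bet.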
~

The Problem With “People Are Silly”

However, relying solely on expected utility seems to make you vulnerable to a dilemma very similar to Pascal’s Mugging. Consider set four where the difference is a lot more blatant:
Set Four:
  1. $24000, with certainty.

  2. 0.0001% chance of $27 billion, 99.9999% chance of nothing.

From set four, which of the two would you choose? Which of the two is the most intuitively appealing? Which of the two would you choose if your only goal is to maximize the amount of dollars you receive? And most importantly, how do you justify your choice?

When we go solely by the expected utility calculations we get:
U(Set Four, Choice 1) = 1.00 * U($24000) = 24000
U(Set Four, Choice 2) = 0.000001 * U($27000000000) = 27000
Shutting up and multiplying tells us that if we go with Set Four, Choice 1, we are forfeiting $3000 in expectation. Our intuition tells us that if we go with Set Four, Choice 2, we just chose a lottery ticket over $24000.

So here’s the real dilemma: suppose you have to pay $10000 to play set four. The expected utility calculations now say choice 1 yields $14000 and choice 2 yields $17000.
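The fee-adjusted numbers come straight from subtracting the $10000 from each expected value (a sketch, still under the linear-utility assumption):

```python
fee = 10000

# Set four, choice 1: $24000 with certainty.
net_certain = 1.00 * 24000 - fee
# Set four, choice 2: 0.0001% chance of $27 billion, otherwise nothing.
net_gamble = 0.000001 * 27_000_000_000 - fee

print(round(net_certain), round(net_gamble))  # 14000 17000
```

Note what the gamble actually looks like: in expectation it wins by $3000, yet 99.9999% of the time it leaves you $10000 poorer.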
So which choice do you take? And how do you defend your choice as the rational one?

And if your answer is that your utility for money is not linear, check whether that’s your real rejection. What would you do if you were going to donate the money? What would you do if you were in the least convenient possible world, where your utility function for money is linear?
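As one illustration of what a concave utility function does here, take set four under logarithmic utility; the log form is my example, not anything the problem specifies, and ln(1 + x) is used so that winning nothing stays well-defined:

```python
import math

def log_utility(dollars):
    """An example concave utility: u($x) = ln(1 + x). Illustrative only."""
    return math.log1p(dollars)

# Set four, recomputed with log utility instead of linear utility.
certain = 1.00 * log_utility(24000)
gamble = 0.000001 * log_utility(27_000_000_000) + 0.999999 * log_utility(0)

print(certain > gamble)  # True: the sure $24000 wins by a huge margin
```

Which is exactly why the last question matters: in the least convenient possible world where your utility really is linear, this easy escape from the dilemma is closed off.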