Well, if someone knows about systematic biases that don’t go away with incentivization, they’re probably too busy making money off that insight to comment here!
In practice, when the stakes are high, it is not so much that people start thinking more accurately—though this will happen to some extent, and for some people dramatically so—but rather that they become more cautious.
If you take ordinary folks into the lab and ask them questions they don’t care about, it’s easy to get them to commit all sorts of logical errors. However, if you approach them with a serious deal where some bias identified in the lab would lead them to accept unfavorable terms with real consequences, they won’t trust their unreliable judgments, and instead they’ll ask for third-party advice and see what the normal and usual way to handle such a situation is. If no such guidance is available, they’ll fall back on the status quo heuristic. People hate to admit their intellectual limitations explicitly, but they’re good at recognizing them instinctively before they get themselves into trouble by relying on their faulty reasoning too much.
This is why for all the systematic biases discussed here, it’s extremely hard to actually exploit these biases in practice to make money. It also explains how market bubbles and Ponzi schemes can lead to such awful collective insanity: as the snowball keeps rolling and growing, people see others massively falling for the scam, and conclude that it must be a safe and sound option if all these other normal and respectable folks are doing it. The caution/normality/status quo heuristics break down in this situation.
Yeah… this is what Bryan Caplan says in The Myth of the Rational Voter.
If you read the Book on Checklists (which I HIGHLY recommend to everyone), it’s filled with examples of a) people getting consistently saved by checklists and b) trained professionals not wanting to bother with such a mundane thing. Most of these professionals do still care about their patients.
Also in the book was an example of a baseball team figuring out that the standard scoring methods are slightly off, finding a better one, and then buying up undervalued players. The more I look, the more examples I find of professionals ignoring sound methods that would make them reliably more successful, so I think opportunities for unbiased action are everywhere. But many don’t really lead to that much more success.
What do you mean by “Book on Checklists”? Atul Gawande’s “The Checklist Manifesto”?
Yes.
So this actually contradicts the paper that Eliezer cited, or at least seems to, yet it seems to ring true. Not only does the hypothesis that incentives reduce bias seem convincing to you and me, but it also forms a whole chapter of Caplan’s book “The Myth of the Rational Voter”.
I think I’m going to split the prize between this comment and Eliezer Yudkowsky’s one. Feel free to PM me and claim your prize.
Roko:
There is no contradiction. This paper shows that when stakes increase, people start thinking somewhat more accurately, but not drastically so—which is exactly what I wrote above.
What these researchers did was not the sort of thing that triggers people’s caution/normality/status-quo heuristics that I had in mind. They put people in a situation where they stood only to gain free money, and were forced to choose between several options, each with a guaranteed non-negative outcome. A study that would actually test my claims would observe people in a situation where they could lose a significant amount of their own money by accepting a bad deal based on biased reasoning.
Of course, this actually happens sometimes, for example with people who ruin themselves by compulsive gambling. But these are rare exceptions, not instances of all-pervasive systematic biases.
I don’t have a Paypal account, but you can buy me a beer next time I’m over in the U.K. :-)
I’m prepared to double-or-nothing that $5 on the claim that loss aversion would kick in and they’d go insane and do even worse than in the winnings-only study.
Yeah… that sounds right. Also, suppose you have an irrational stock price. One or two contrarians can’t make much more than double their stake out of it, because if they use leverage, the market might get more irrational before it gets less irrational and wipe out their position.
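As a toy illustration of that wipe-out dynamic (all numbers here are hypothetical, not from the discussion): a contrarian shorts an overpriced stock with 2x leverage, and even though the price eventually falls to fair value, a further irrational run-up exhausts their capital first.

```python
# Toy model of a leveraged contrarian bet (all numbers are hypothetical).
# The trader shorts a stock at 100 with 2x leverage: 50 of their own
# equity backs a 100 short position. Fair value is 70, but the bubble
# inflates before it pops.

equity = 50.0          # trader's own capital
position = 100.0       # value of shares sold short at entry
entry_price = 100.0

price_path = [110, 125, 150, 90, 70]  # bubble grows, then collapses

wiped_out = False
for price in price_path:
    # Mark-to-market P&L on the short: losses mount as the price rises.
    pnl = position * (entry_price - price) / entry_price
    if equity + pnl <= 0:  # losses have consumed all the trader's capital
        wiped_out = True
        break

# At a price of 150 the loss is 100 * (100 - 150) / 100 = -50, exactly
# exhausting the 50 of equity, so the position is liquidated before the
# eventual fall to 70 can pay off.
print(wiped_out)  # True
```

An unleveraged short would have survived the run-up to 150 and profited from the collapse; the leverage is what converts a temporary paper loss into a permanent one.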
“The market can stay irrational longer than you can stay solvent.”—John Maynard Keynes