Review of Kahneman, ‘Thinking, Fast and Slow’ (2011)

Thinking, Fast and Slow is Kahneman’s first book for a general audience, and a summary of his far-reaching and important work. Over the course of about 400 pages (not counting the appendices, notes, or index), Kahneman explains his current views on System 1 vs. System 2 thinking, heuristics and biases, overconfidence, decision making under uncertainty, the differences between the experiencing self and the remembering self, and the implications of combining all this knowledge.

In short: If you care about improving your thinking and decision making, and thus you care about the cognitive science of rationality, then you are likely to enjoy — and benefit from — this book. And if you know people who won’t read the Core Sequences, getting them to read Thinking, Fast and Slow will take them 30% of the way.

Kahneman leaps deftly between demonstration (“try this word problem, notice what your brain does”), theory, and research stories. He covers dozens of issues likely to be familiar to veteran LWers, and perhaps a dozen more that have never been discussed on Less Wrong: availability cascades, causal stereotyping, the illusion of validity, the material on expert intuition from chapter 22, duration neglect, the peak-end effect, affective forecasting and “miswanting,” and more.

Each chapter ends with snippets of fictional dialogue, showing what it would be like to use the concepts introduced in that chapter in everyday speech. What is remarkable is how much these snippets sound like things I hear in daily conversations at Singularity Institute. For example:

  • “What came quickly to my mind was an intuition from System 1. I’ll have to start over and search my memory deliberately.”

  • “She knows nothing about this person’s management skills. All she is going by is the halo effect from a good presentation.”

  • “Do we still remember the question we are trying to answer? Or have we substituted an easier one?”

  • “This start-up looks as if it could not fail, but the base rate of success in the industry is extremely low. How do we know this case is different?”

  • “Let’s reframe the problem by changing the reference point. Imagine we did not own it; how much would we think it is worth?”

Other dialogue snippets from Kahneman’s book are considered so obvious within Singularity Institute that sentences like them are often left half-spoken: somebody interrupts and moves on, because everyone in the room already knows the rest of the sentence, and knows that everyone else knows it too:

  • “They were primed to find flaws, and this is exactly what they found.”

  • “He underestimates the risks of indoor pollution because there are few media stories on them. That’s an availability effect. He should look at the statistics.”

  • “The mistake appears obvious, but it is just hindsight. You could not have known in advance.”

  • “He’s taking an inside view. He should forget about his own case and look for what happened in other cases.”

  • “He weighs losses about twice as much as gains, which is normal.”

Other dialogue snippets from the book are even more obvious within Singularity Institute, and they can be communicated merely by raising an eyebrow at what someone has said:

  • “This is your System 1 talking. Slow down and let your System 2 take control.”

  • “The sample of observations is too small to make any inferences. Let’s not follow the law of small numbers.”

In the final chapter, Kahneman reflects on the good news that his and his colleagues’ work is having an effect at the policy level. As a result of Nudge: Improving Decisions about Health, Wealth, and Happiness, a book Cass Sunstein co-wrote with Richard Thaler, Sunstein was invited by President Obama to serve as administrator of the Office of Information and Regulatory Affairs. From that post, Sunstein has successfully implemented many new policies that treat humans as humans rather than as specimens of Homo economicus:

...applications that have been implemented [by Sunstein] include automatic enrollment in health insurance, a new version of the dietary guidelines that replaces the incomprehensible Food Pyramid with the powerful image of a Food Plate loaded with a balanced diet, and a rule formulated by the USDA that permits the inclusion of messages such as “90% fat-free” on the label of meat products, provided that the statement “10% fat” is also displayed “contiguous to, in lettering of the same color, size, and type as, and on the same color background as, the statement of lean percentage.”

The British government has also responded by forming a special unit dedicated to applying decision science to policy-making. Officially it is called the Behavioural Insights Team, but internally people just call it the Nudge Unit.