Rationality Reading Group: Part J: Death Spirals

This is part of a semi-monthly reading group on Eliezer Yudkowsky’s ebook, Rationality: From AI to Zombies. For more information about the group, see the announcement post.


Welcome to the Rationality reading group. This fortnight we discuss Part J: Death Spirals (pp. 409-494). This post summarizes each article of the sequence, linking to the original LessWrong post where available.

J. Death Spirals

100. The Affect Heuristic - Positive and negative emotional impressions exert a greater effect on many decisions than does rational analysis.

101. Evaluability (and Cheap Holiday Shopping) - It’s difficult for humans to evaluate an option except in comparison to other options. Poor decisions result when a poor category for comparison is used. Includes an application for cheap gift-shopping.

102. Unbounded Scales, Huge Jury Awards, and Futurism - Without a metric for comparison, estimates of, e.g., what sorts of punitive damages should be awarded, or when some future advance will happen, vary widely simply due to the lack of a scale.

103. The Halo Effect - Positive qualities seem to correlate with each other, whether or not they actually do.

104. Superhero Bias - It is better to risk your life to save 200 people than to save 3. But someone who risks their life to save 3 people is revealing a more altruistic nature than someone risking their life to save 200. And yet comic books are written about heroes who save 200 innocent schoolchildren, and not police officers saving three prostitutes.

105. Mere Messiahs - John Perry, an extropian and a transhumanist, died when the north tower of the World Trade Center fell. He knew he was risking his existence to save other people, and he had hope that he might be able to avoid death, but he still helped them. This takes far more courage than dying while expecting to be rewarded in an afterlife for one's virtue.

106. Affective Death Spirals - Human beings can fall into a feedback loop around something that they hold dear. Every situation they consider, they use their great idea to explain. Because their great idea explained this situation, it now gains weight. Therefore, they should use it to explain more situations. This loop can continue, until they believe Belgium controls the US banking system, or that they can use an invisible blue spirit force to locate parking spots.

107. Resist the Happy Death Spiral - You can avoid a Happy Death Spiral by (1) splitting the Great Idea into parts, (2) treating every additional detail as burdensome, (3) thinking about the specifics of the causal chain instead of the good or bad feelings, (4) not rehearsing evidence, and (5) not adding happiness from claims that “you can’t prove are wrong”; but not by (6) refusing to admire anything too much, (7) conducting a biased search for negative points until you feel unhappy again, or (8) forcibly shoving an idea into a safe box.

108. Uncritical Supercriticality - One of the most dangerous mistakes that a human being with human psychology can make is to begin thinking that any argument against their favorite idea must be wrong, because it is against their favorite idea. Alternatively, they could think that any argument that supports their favorite idea must be right. This failure of reasoning has led to massive amounts of suffering and death in world history.

109. Evaporative Cooling of Group Beliefs - When a cult encounters a blow to its beliefs (a prediction fails to come true, the leader is caught in a scandal, etc.), the cult will often become more fanatical. In the immediate aftermath, the members who leave will be the ones who were previously the voice of opposition, skepticism, and moderation. Without those members, the cult will slide further in the direction of fanaticism.

110. When None Dare Urge Restraint - The dark mirror to the happy death spiral is the spiral of hate. When everyone looks good for attacking someone, and anyone who disagrees with any attack must be a sympathizer to the enemy, the results are usually awful. It is too dangerous for there to be anyone in the world about whom we would rather say negative things than accurate things.

111. The Robbers Cave Experiment - The Robbers Cave Experiment, by Sherif, Harvey, White, Hood, and Sherif (1954/1961), was designed to investigate the causes and remedies of problems between groups. Twenty-two middle-school-aged boys were divided into two groups and placed in a summer camp. From the first time the groups learned of each other’s existence, a brutal rivalry began. The only way the counselors managed to bring the groups together was by giving them a common enemy. Any resemblance to modern politics is just your imagination.

112. Every Cause Wants to Be a Cult - The genetic fallacy seems like a strange kind of fallacy. The problem is that the original justification for a belief does not always equal the sum of all the evidence that we currently have available. But, on the other hand, it is very easy for people to still believe untruths from a source that they have since rejected.

113. Guardians of the Truth - There is an enormous psychological difference between believing that you absolutely, certainly, have the truth, and trying to discover the truth. If you believe that you have the truth, and that it must be protected from heretics, torture and murder follow. Alternatively, if you believe that you are close to the truth, but perhaps not there yet, someone who disagrees with you is simply wrong, not a mortal enemy.

114. Guardians of the Gene Pool - It is a common misconception that the Nazis wanted their eugenics program to create a new breed of supermen. In fact, they wanted to breed back to the archetypal Nordic man. They located their ideals in the past, which is a counterintuitive idea for many of us.

115. Guardians of Ayn Rand - Ayn Rand, the leader of the Objectivists, praised reason and rationality. The group she created became a cult. Praising rationality does not provide immunity to the human tendency toward cultishness.

116. Two Cult Koans - Two koans about individuals concerned that they may have joined a cult.

117. Asch’s Conformity Experiment - The unanimous agreement of surrounding others can make subjects disbelieve (or at least, fail to report) what’s right before their eyes. The addition of just one dissenter is enough to dramatically reduce the rates of improper conformity.

118. On Expressing Your Concerns - A way of breaking the conformity effect in some cases.

119. Lonely Dissent - Joining a revolution does take courage, but it is something that humans can reliably do. Risking death is harder still. But it is harder than either of these to be the first person in a rebellion, the only one who is saying something different. That doesn’t feel like going to school in black. It feels like going to school in a clown suit.

120. Cultish Countercultishness - People often nervously ask, “This isn’t a cult, is it?” when encountering a group that thinks something weird. There are many reasons why this question doesn’t make sense. For one thing, if you really were a member of a cult, you would not say so. Instead, when considering whether or not to join a group, you should consider the details of the group itself. Is their reasoning sound? Do they do awful things to their members?


This has been a collection of notes on the assigned sequence for this fortnight. The most important part of the reading group, though, is the discussion, which takes place in the comments section. Please remember that this group contains a variety of levels of expertise: if a line of discussion seems too basic or too incomprehensible, look around for one that suits you better!

The next reading will cover Part K: Letting Go (pp. 497-532). The discussion will go live on Wednesday, 7 October 2015, right here on the discussion forum of LessWrong.