Outline of Galef’s “Scout Mindset”
Julia Galef’s The Scout Mindset is superb.
For effective altruists, I think (based on the topic and execution) it’s straightforwardly the #1 book you should use when you want to recruit new people to EA. It doesn’t actually talk much about EA, but I think starting people on this book will result in an EA that’s thriving more and doing more good five years from now, compared to the future EA that would exist if the top go-to resource were more obvious choices like The Precipice, Doing Good Better, the EA Handbook, etc.
For rationalists, I think the best intro resource is still HPMoR or R:AZ, but I think Scout Mindset is a great supplement to those, and probably a better starting point for people who prefer Julia’s writing style over Eliezer’s.
I’ve made an outline of the book below, for my own reference and for others who have read it. If you don’t mind spoilers, you can also use this to help decide whether the book’s worth reading for you, though my summary skips a lot and doesn’t do justice to Julia’s arguments.
Scout mindset is “the motivation to see things as they are, not as you wish they were”.
We aren’t perfect scouts, but we can improve. “My approach has three prongs”:
Realize that truth isn’t in conflict with your other goals. People tend to overestimate how useful self-deception is for things like personal happiness and motivation, starting a company, being an activist, etc.
Learn tools that make it easier to see clearly. Use various kinds of thought experiments and probabilistic reasoning, and rethink how you go about listening to the “other side” of an issue.
Appreciate the emotional rewards of scout mindset. “It’s empowering to be able to resist the temptation to self-deceive, and to know that you can face reality even when it’s unpleasant. There’s an equanimity that results from understanding risk and coming to terms with the odds you’re facing. And there’s a refreshing lightness in the feeling of being free to explore ideas and follow the evidence wherever it leads”. Looking at lots of real-world examples of people who have exemplified scout mindset can make these positives more salient.
PART I: The Case for Scout Mindset
Chapter 1. Two Types of Thinking
“Can I believe it?” vs. “must I believe it?” In directionally motivated reasoning, often shortened to “motivated reasoning”, we disproportionately put our effort into finding evidence/reasons that support what we wish were true.
Reasoning as defensive combat. Motivated reasoning, a.k.a. soldier mindset, “doesn’t feel like motivated reasoning from the inside”. But it’s extremely common, as shown by how often we describe our reasoning in militaristic terms.
“Is it true?” An alternative to (directionally) motivated reasoning is accuracy motivated reasoning, i.e., scout mindset.
Your mindset can make or break your judgment. This stuff matters in real life, in almost every domain. Nobody is purely a scout or purely a soldier, but it’s possible to become more scout-like.
Chapter 2. What the Soldier is Protecting
“[I]f scout mindset is so great, why isn’t everyone already using it all the time?” Three emotional reasons:
Comfort: avoiding unpleasant emotions. This even includes comforting pessimism: “there’s no hope, so you might as well not worry about it.”
Self-esteem: feeling good about ourselves. Again, this can include ego-protecting negativity and avoiding “‘getting my hopes up’”.
Morale: motivating ourselves to do hard things.
And three social reasons:
Persuasion: convincing ourselves so we can convince others.
Image: choosing beliefs that make us look good. “Psychologists call it impression management, and evolutionary psychologists call it signaling: When considering a claim, we implicitly ask ourselves, ‘What kind of person would believe a claim like this, and is that how I want others to see me?’”
Belonging: fitting in to your social groups.
“We use motivated reasoning not because we don’t know any better, but because we’re trying to protect things that are vitally important to us”. So it’s no surprise that interventions like ‘training people in critical thinking’ don’t help change people’s thinking. But while “soldier mindset is often our default strategy for getting what we want”, it’s not generally the best strategy available.
Chapter 3. Why Truth is More Valuable Than We Realize
We make unconscious trade-offs. “[T]he whole point of self-deception is that it’s occurring beneath our conscious awareness. [...] So it’s left up to our unconscious minds to choose, on a case-by-case basis, which goals to prioritize.” Sometimes it chooses to be more soldier-like, sometimes more scout-like.
Are we rationally irrational? I.e., are we good at “unconsciously choosing just enough epistemic irrationality to achieve our [instrumental] social and emotional goals, without impairing our judgment too much”? No. “There are several major biases in our decision-making [… that] cause us to overvalue soldier mindset”:
We overvalue the immediate rewards of soldier mindset. Present bias: we prefer small rewards now over large rewards later.
We underestimate the value of building scout habits. Cognitive skills are abstract (and, again, have most of their benefits in the future), so they’re harder to notice and care about.
We underestimate the ripple effects of self-deception. These ripple effects are “delayed and unpredictable”, which is “exactly the kind of cost we tend to neglect”.
We overestimate social costs.
An accurate map is more useful now. Humans have more options now than we did tens of thousands of years ago, and more ability to improve our circumstances. “So if our instincts undervalue truth, that’s not surprising—our instincts evolved in a different world, one better suited to the soldier.”
PART II: Developing Self-Awareness
Chapter 4. Signs of a Scout
“A key factor preventing us from being in scout mindset more frequently is our conviction that we’re already in it.” Examples of “things that make us feel like scouts even when we’re not”:
Feeling objective doesn’t make you a scout.
Being smart and knowledgeable doesn’t make you a scout. On ideologically charged questions, learning more tends to make people more polarized; and even scientists studying cognitive biases have a track record of exhibiting soldier mindset.
Actually practicing scout mindset makes you a scout. “The test of scout mindset isn’t whether you see yourself as the kind of person who [changes your mind in response to evidence, is fair-minded, etc. …] It’s whether you can point to concrete cases in which you did, in fact, do those things. [...] The only real sign of a scout is whether you act like one.” Behavioral cues to look for:
Do you tell other people when you realize they were right?
How do you react to personal criticism? “Are there examples of criticism you’ve acted upon? Have you rewarded a critic (for example, by promoting him)? Do you go out of your way to make it easier for other people to criticize you?”
Do you ever prove yourself wrong?
Do you take precautions to avoid fooling yourself? E.g., “Do you avoid biasing the information you get?” and “[D]o you decide ahead of time what will count as a success and what will count as a failure, so you’re not tempted to move the goalposts later?”
Do you have any good critics? “Can you name people who are critical of your beliefs, profession, or life choices who you consider thoughtful, even if you believe they’re wrong? Or can you at least name reasons why someone might disagree with you that you would consider reasonable[...]?”
“But the biggest sign of scout mindset may be this: Can you point to occasions in which you were in soldier mindset? [… M]otivated reasoning is our natural state,” so if you never notice yourself doing it, the likeliest explanation is that you’re not self-aware about it.
Chapter 5. Noticing Bias
“One of the essential tools in a magician’s tool kit is a form of manipulation called forcing.” The magician asks you to choose between two cards. “If you point to the card on the left, he says, ‘Okay, that one’s yours.’ If you point to the card on the right, he says, ‘Okay, we’ll remove that one.’ [...] If you could see both of those possible scenarios at once, the trick would be obvious. But because you end up in only one of those worlds, you never realize.”
“Forcing is what your brain is doing to get away with motivated reasoning while still making you feel like you’re being objective.” The Democratic voter doesn’t notice that they’re going easier on a Democratic politician than they would on a Republican, because the question “How would I act if this politician were a Republican?” isn’t salient to them, or they’re tricking themselves into thinking they’d apply the same standard.
A thought experiment is a peek into the counterfactual world. “You can’t detect motivated reasoning in yourself just by scrutinizing your reasoning and concluding that it makes sense. You have to compare your reasoning to the way you would have reasoned in a counterfactual world, a world in which your motivations were different—would you judge that politician’s actions differently if he was in the opposite party? [...] Would you consider that study’s methodology sound if its conclusion supported your side? [...] Try to actually imagine the counterfactual scenario. [… D]on’t simply formulate a verbal question for yourself. Conjure up the counterfactual world, place yourself in it, and observe your reaction.” Five types of thought experiment:
The double standard test. Am I judging one person/group by a standard I wouldn’t apply to another person/group?
The outsider test. “Imagine someone else stepped into your shoes—what do you expect they would do in your situation?” Or imagine that you’re an outsider who just magically teleported into your body.
The conformity test. “If other people no longer held this view, would you still hold it?”
The selective skeptic test. “Imagine this evidence supported the other side. How credible would you find it then?”
The status quo bias test. “Imagine your current situation was no longer the status quo. Would you then actively choose it?”
Thought experiments on their own “can’t tell you what’s true or fair or what decision you should make.” But they allow you to catch your brain “in the act of motivated reasoning,” and take that into account as you work to figure out what’s true.
Beyond the specific thought experiments, the core skill of this chapter is “a kind of self-awareness, a sense that your judgments are contingent—that what seems true or reasonable or fair or desirable can change when you mentally vary some features of the question that should have been irrelevant.”
Chapter 6. How Sure Are You?
We like feeling certain. “Your strength as a scout is in your ability [...] to think in shades of gray instead of black and white. To distinguish the feeling of ’95% sure’ from ‘75% sure’ from ‘55% sure’.”
Quantifying your uncertainty. For scouts, probabilities are predictions of how likely they are to be right. The goal is to be calibrated in the probabilities you assign.
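Calibration is easy to check against your own track record. Here’s a minimal Python sketch (mine, not Julia’s): bucket your past predictions by stated confidence and compare each bucket’s stated probability to its actual hit rate. The `record` data below is hypothetical.

```python
from collections import defaultdict

def calibration_report(predictions):
    """Group predictions by stated confidence and compare to hit rate.

    predictions: list of (stated_probability, came_true) pairs.
    Well-calibrated means: within each confidence bucket, the fraction
    of predictions that came true matches the stated probability.
    """
    buckets = defaultdict(list)
    for prob, came_true in predictions:
        buckets[round(prob, 1)].append(came_true)  # bucket to nearest 10%
    return {prob: sum(hits) / len(hits) for prob, hits in sorted(buckets.items())}

# Hypothetical track record: claims you called "60% likely" and "90% likely".
record = [(0.6, True), (0.6, False), (0.6, True), (0.6, True), (0.6, False),
          (0.9, True), (0.9, True), (0.9, True), (0.9, True), (0.9, False)]
print(calibration_report(record))  # → {0.6: 0.6, 0.9: 0.8}
```

In this toy record you’re well calibrated at 60% but overconfident at 90%: only 80% of your “90% sure” claims came true.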
A bet can reveal how sure you really are. “Evolutionary psychologist Robert Kurzban has an analogy[...] In a company, there’s a board of directors whose role is to make the crucial decisions for the company—how to spend its budget, which risks to take, when to change strategies, and so on. Then there’s a press secretary whose role it is to give statements[...] The press secretary makes claims; the board makes bets. [...] A bet is any decision in which you stand to gain or lose something of value, based on the outcome.”
The equivalent bet test. By comparing different bets and seeing when you prefer taking one vs. the other, or when they feel about the same, you can translate your feeling “does X sound like a good bet?” into probabilities.
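The equivalent bet test works by comparing a bet on your claim against a bet with a known probability (say, drawing a winning ball from an urn) and adjusting that known probability until you feel indifferent. As a toy illustration (not from the book), that indifference search is just a bisection; `prefer_claim_bet` stands in for your gut answer to each comparison:

```python
def equivalent_bet_probability(prefer_claim_bet, tol=0.01):
    """Estimate your credence in a claim via the equivalent bet test.

    prefer_claim_bet(p) should return True if you'd rather bet on the
    claim being true than on drawing a winning ball from an urn where
    a fraction p of the balls win (same prize either way).
    Your credence is the p at which you become indifferent.
    """
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        p = (lo + hi) / 2
        if prefer_claim_bet(p):
            lo = p  # you think the claim is likelier than p
        else:
            hi = p  # the urn looks better, so your credence is below p
    return (lo + hi) / 2

# Example: a hypothetical respondent whose true credence is 0.7,
# answering each comparison question consistently.
estimate = equivalent_bet_probability(lambda p: 0.7 > p)
print(round(estimate, 2))  # → 0.7
```

In real life you answer each comparison by introspection rather than with a lambda, but the structure is the same: a handful of “which bet would I rather take?” questions pins your vague feeling down to a number.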
The core skill of this chapter is “being able to tell the difference between the feeling of making a claim and the feeling of actually trying to guess what’s true.”
PART III: Thriving Without Illusions
Chapter 7. Coping with Reality
Keeping despair at bay. Motivated reasoning is especially tempting in emergencies; but it’s also especially dangerous in emergencies. In dire situations, it’s essential to be able to keep despair at bay without distorting your map of reality. E.g., you can count your blessings, come to terms with your situation, or remind yourself that you’re doing the best you can.
Honest vs. self-deceptive ways of coping. Honest ways of coping with painful or difficult circumstances include:
Make a plan. “It’s striking how much the urge to conclude ‘That’s not true’ diminishes once you feel like you have a concrete plan for what you would do if the thing were true.”
Notice silver linings. “You’re recognizing a silver lining to the cloud, not trying to convince yourself the whole cloud is silver. But in many cases, that’s all you need”.
Focus on a different goal.
Things could be worse.
Does research show that self-deceived people are happier? No, the research quality is terrible.
Chapter 8. Motivation Without Self-Deception
Using self-deception to motivate yourself is bad, because:
An accurate picture of your odds helps you choose between goals.
An accurate picture of the odds helps you adapt your plan over time.
An accurate picture of the odds helps you decide how much to stake on success.
Bets worth taking. “[S]couts aren’t motivated by the thought, ‘This is going to succeed.’ They’re motivated by the thought, ‘This is a bet worth taking.’” Which bets are worth taking is a matter of their expected value.
Accepting variance gives you equanimity. Expecting to always succeed is unrealistic, and will lead to unnecessary disappointments. “Instead of being elated when your bets pay off, and crushed when they don’t,” try to get a realistic picture of the variance in bets and focus on ensuring your bets have high expected value.
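To make the “high expected value despite variance” point concrete, here’s a toy simulation (mine, not from the book) of a bet that loses most of the time but is still worth taking repeatedly:

```python
import random

random.seed(0)  # reproducible runs

def simulate_bets(p_win, payoff, stake, n_bets):
    """Take the same bet n_bets times; return total profit.

    Each bet: pay `stake`; with probability p_win, receive `payoff`.
    Expected value per bet = p_win * payoff - stake.
    """
    total = 0.0
    for _ in range(n_bets):
        total -= stake
        if random.random() < p_win:
            total += payoff
    return total

# A bet that loses 70% of the time but has positive expected value:
# EV = 0.3 * 500 - 100 = +50 per bet, so about +5,000 over 100 bets.
results = [simulate_bets(p_win=0.3, payoff=500, stake=100, n_bets=100)
           for _ in range(1000)]
print(sum(r > 0 for r in results) / len(results))  # fraction of runs ending in profit
print(min(results))                                # the unluckiest run
```

Any single bet usually loses, and even a run of 100 can occasionally end in the red; but across many bets the variance washes out and the expected value dominates, which is why the scout focuses on EV rather than on whether this particular bet paid off.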
Coming to terms with the risk.
Chapter 9. Influence Without Overconfidence
Two types of confidence. Epistemic confidence is “how sure you are about what’s true,” while social confidence is self-assurance: “Are you at ease in social situations? Do you act like you deserve to be there, like you’re secure in yourself and your role in the group? Do you speak as if you’re worth listening to?” Influencing people requires social confidence, which people often conflate with epistemic confidence.
People judge you on social confidence, not epistemic confidence. Various studies show that judgments of competence are mediated by perceived social (rather than epistemic) confidence.
Two kinds of uncertainty. People trust you less if you seem uncertain due to ignorance or inexperience, but not if you seem uncertain due to reality being messy and unpredictable. Three ways to communicate uncertainty without looking inexperienced or incompetent:
Show that uncertainty is justified.
Give informed estimates. “Even if reality is messy and it’s impossible to know the right answer with confidence, you can at least be confident in your analysis.”
Have a plan.
You don’t need to promise success to be inspiring. “You can paint a picture of the world you’re trying to create, or why your mission is important, or how your product has helped people, without claiming you’re guaranteed to succeed. There are lots of ways to get people excited that don’t require you to lie to others or to yourself.”
“That’s the overarching theme of these last three chapters: whatever your goal, there’s probably a way to get it that doesn’t require you to believe false things.”
PART IV: Changing Your Mind
Chapter 10. How to Be Wrong
Change your mind a little at a time. Superforecasters constantly revise their views in small ways.
Recognizing you were wrong makes you better at being right. Most people, when they learn they were wrong, give excuses like “I Was Almost Right”. Superforecasters instead “reevaluate their process, asking, ‘What does this teach me about how to make better forecasts?’”
Learning domain-general lessons. Even if your error is in a domain that seems unimportant to you, noticing your errors can teach domain-general lessons “about how the world works, or how your own brain works, and about the kinds of biases that tend to influence your judgment.” Or they can help you fully internalize a lesson you previously only believed in the abstract.
“Admitting a mistake” vs. “updating”. Being factually wrong about something doesn’t necessarily mean you screwed up. Learning new information should usually be thought of in matter-of-fact terms, as an opportunity to update your beliefs—not as something humbling or embarrassing.
If you’re not changing your mind, you’re doing something wrong. By default, you should be learning more over time, and changing your strategy accordingly.
Chapter 11. Lean in to Confusion
Usually, “we react to observations that conflict with our worldview by explaining them away. [...] We couldn’t function in the world if we were constantly questioning our perception of reality. But especially when motivated reasoning is in play, we take it too far, shoehorning conflicting evidence into a narrative” well past the point where it makes sense. “This chapter is about how to resist the urge to dismiss details that don’t fit your theories, and instead, allow yourself to be confused and intrigued by them, to see them as puzzles to be solved”.
“You don’t know in advance” what surprising and confusing observations will teach you. “All too often, we assume the only two possibilities are ‘I’m right’ or ‘The other guy is right’[...] But in many cases, there’s an unknown unknown, a hidden ‘option C,’ that enriches our picture of the world in a way we wouldn’t have been able to anticipate.”
Anomalies pile up and cause a paradigm shift. “Acknowledge anomalies, even if you don’t yet know how to explain them, and even if the old paradigm still seems correct overall. Maybe they’ll add up to nothing in particular. Maybe they just mean that reality is messy. But maybe they’re laying the groundwork for a big change of view.”
Be willing to stay confused.
Chapter 12. Escape Your Echo Chamber
How not to learn from disagreement. Listening to the “other side” usually makes people more polarized. “By default, we end up listening to people who initiate disagreements with us, as well as the public figures and media outlets who are the most popular representatives of the other side.” But people who initiate disagreements tend to be unusually disagreeable, and popular representatives of an ideology are often “ones who do things like cheering for their side and mocking or caricaturing the other side—i.e., you”. To learn from disagreement:
Listen to people you find reasonable.
Listen to people you share intellectual common ground with.
Listen to people who share your goals.
The problem with a “team of rivals.” “Dissent isn’t all that useful from people you don’t respect or from people who don’t even share enough common ground with you to agree that you’re supposed to be on the same team.”
It’s harder than you think. “We assume that if both people are basically reasonable and arguing in good faith, then getting to the bottom of a disagreement should be straightforward[...] When things don’t play out that way [...] everyone gets frustrated and concludes the others must be irrational.” But even under ideal conditions, learning from disagreements is still hard, e.g., because:
We misunderstand each other’s views.
Bad arguments inoculate us against good arguments.
Our beliefs are interdependent—changing one requires changing others.
PART V: Rethinking Identity
Chapter 13. How Beliefs Become Identities
What it means for something to be part of your identity. Criticizing part of someone’s identity tends to spark passionate, combative, and defensive reactions. Two things tend to turn a belief into an identity: feeling embattled, and feeling proud.
Signs a belief might be an identity.
Using the phrase “I believe”.
Getting annoyed when an ideology is criticized.
A righteous tone.
Having to defend your view.
“Identifying with a belief makes you feel like you have to be ready to defend it, which motivates you to focus your attention on collecting evidence in its favor. Identity makes you reflexively reject arguments that feel like attacks on you or on the status of your group. [...] And when a belief is part of your identity it becomes far harder to change your mind[.]”
Chapter 14. Hold Your Identity Lightly
What it means to hold your identity lightly. Rather than trying to have no identities, you should try to “keep those identities from colonizing your thoughts and values. [...] Holding your identity lightly means thinking of it in a matter-of-fact way, rather than as a central source of pride and meaning in your life. It’s a description, not a flag to be waved proudly.”
Could you pass an ideological Turing test? Passing means explaining an ideology “as a believer would, convincingly enough that other people couldn’t tell the difference between you and a genuine believer”. The ideological Turing test tests your knowledge of the other side’s beliefs, but “it also serves as an emotional test: Do you hold your identity lightly enough to be able to avoid caricaturing your ideological opponents?”
A strongly held identity prevents you from persuading others.
Understanding the other side makes it possible to change minds.
Is holding your identity lightly compatible with activism? Activists usually “face trade-offs between identity and impact,” and holding your identity lightly can make it easier to focus on the highest-impact options.
Chapter 15. A Scout Identity
Flipping the script on identity. Identifying as a truth-seeker can make you a better scout.
Identity makes hard things rewarding. When you act like a scout, you can take pride and satisfaction in living up to your values. This short-term reward helps patch our bias for short-term rewards, which normally favors soldier mindset.
Your communities shape your identity. “[I]n the medium-to-long term, one of the biggest things you can do to change your thinking is to change the people you surround yourself with.”
You can choose what kind of people you attract. You can’t please everyone, so “you might as well aim to please the kind of people you’d most like to have around you, people who you respect and who motivate you to be a better version of yourself”.
You can choose your communities online.
You can choose your role models.