[Question] What should experienced rationalists know?

The obvious answer is ‘the Sequences’, but in my opinion that is neither necessary nor sufficient. The Sequences are valuable, but they are quite old at this point. They also run to over a million words (though Rationality: A to Z is only ~600k). Here is a list of core skills and ideas:

1 - Rationality Techniques

Ideally, an experienced rationalist would be familiar with most of the CFAR manual. Anyone trying to learn this material needs to actually try the techniques; theoretical knowledge is not enough. If I had to make a shorter list of techniques, I would include:

  • Double Crux / Internal Double Crux

  • Five-minute Timers

  • Trigger Action Plans

  • Bucket Errors

  • Goal/Aversion Factoring

  • Gears-Level Understanding

  • Negative Visualisation / Murphy-Jitsu

  • Focusing

2 - AI Risk: Superintelligence

The rationality community was founded to help solve AI risk. Superintelligence gives an updated and complete version of the ‘classic’ argument for AI risk. It does not make as strong claims about takeoff as Eliezer’s early writings did, which seems useful given that positions closer to Paul Christiano’s seem to be gaining prominence. I think the ‘classic’ arguments are still very much worth understanding. On the other hand, Superintelligence is ~125K words and not easy reading.

I think many readers can skip the first few chapters. The core argument is in chapters five through fourteen.

5. Decisive strategic advantage

6. Cognitive superpowers

7. The superintelligent will

8. Is the default outcome doom?

9. The control problem

10. Oracles, genies, sovereigns, tools

11. Multipolar scenarios

12. Acquiring values

13. Choosing the criteria for choosing

14. The strategic picture

3 - Cognitive Biases: Thinking, Fast and Slow

Priming is the first research area discussed in depth in TF&S, and priming seems to be almost entirely BS. I would suggest skipping the chapter on priming, and remembering that the discussion of the ‘hot hand fallacy’ also seems incorrect. Another potential downside is the length (~175K words). However, I don’t think there is a better source overall. Many of the concepts in TF&S remain fundamental, the writing is quite good, and the historical value is extremely high. Here is a quick review from 2016.

4 - Statistics

It is hard to be an informed rationalist without a basic understanding of Bayesian thinking, and you need to understand frequentist statistics to evaluate a lot of relevant research. Some of the most important concepts and goals are listed below.

Bayesian Statistics (a short worked sketch follows this list):

  • Illustrate the use of odds-ratio calculations in practical situations

  • Derive Laplace’s rule of succession
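
As a minimal sketch of how the two Bayesian items above might cash out in practice (plain Python; the disease-testing numbers are made up purely for illustration):

```python
# Bayesian updating with odds ratios, plus Laplace's rule of succession.
# Plain Python; the numbers below are illustrative, not from any source.

def posterior_odds(prior_odds: float, likelihood_ratio: float) -> float:
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    return prior_odds * likelihood_ratio

def odds_to_prob(odds: float) -> float:
    return odds / (1 + odds)

# Example: a disease with 1% prevalence, a test with 90% sensitivity and a
# 5% false-positive rate. A positive result has likelihood ratio 0.90 / 0.05 = 18.
prior_odds = 0.01 / 0.99
post_odds = posterior_odds(prior_odds, 0.90 / 0.05)
print(f"P(disease | positive test) = {odds_to_prob(post_odds):.2f}")  # about 0.15

def laplace_rule(successes: int, trials: int) -> float:
    """Laplace's rule of succession: with a uniform prior on the success rate,
    P(next trial succeeds | s successes in n trials) = (s + 1) / (n + 2)."""
    return (successes + 1) / (trials + 2)

# The sun has risen on every one of the last 10,000 observed days.
print(f"P(sun rises tomorrow) = {laplace_rule(10_000, 10_000):.5f}")
```

The point of the second exercise is being able to derive (s + 1) / (n + 2) from a uniform prior, not just plug it in.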

Frequentist Statistics (understand the following concepts; a short simulation sketch follows the list):

  • Law of large numbers

  • Power, p-values, t-tests, z-tests

  • Linear Regression

  • Limitations of the above concepts
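
And a minimal sketch of the frequentist items, assuming numpy and scipy are available; the data are simulated, so the specific numbers are only illustrative:

```python
# Law of large numbers, a two-sample t-test, and simple linear regression,
# all on simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Law of large numbers: the sample mean of a fair coin approaches 0.5.
flips = rng.integers(0, 2, size=100_000)
print("mean of 100k coin flips:", flips.mean())

# Two-sample t-test: compare a treatment group (true mean 0.3) to a control.
control = rng.normal(loc=0.0, scale=1.0, size=200)
treated = rng.normal(loc=0.3, scale=1.0, size=200)
t_stat, p_value = stats.ttest_ind(treated, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # small p => surprising under the null

# Simple linear regression: recover slope and intercept from noisy data.
x = rng.uniform(0, 10, size=500)
y = 2.0 * x + 1.0 + rng.normal(scale=2.0, size=500)
result = stats.linregress(x, y)
print(f"slope = {result.slope:.2f}, intercept = {result.intercept:.2f}, "
      f"r^2 = {result.rvalue**2:.2f}")
```

Power analysis is the natural follow-up exercise: how large would the two samples need to be to detect the 0.3 effect reliably at p < 0.05?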

5 - Signalling / The Elephant in the Brain

The Elephant in the Brain is a clear and authoritative source, and the ideas discussed have certainly been influential in the rationalist community. But I am not sure what epistemic status the community assigns to the Hanson/Simler theories around signaling. Any opinions? For reference, here are the chapter topics.

PART I Why We Hide Our Motives

  • 1 Animal Behavior

  • 2 Competition

  • 3 Norms

  • 4 Cheating

  • 5 Self-Deception

  • 6 Counterfeit Reasons

PART II Hidden Motives in Everyday Life

  • 7 Body Language

  • 8 Laughter

  • 9 Conversation

  • 10 Consumption

  • 11 Art

  • 12 Charity

  • 13 Education

  • 14 Medicine

  • 15 Religion

  • 16 Politics

  • 17 Conclusion

What am I missing? Try to be as specific as possible about what exactly should be learned. Some possible topics discussed in the community include:

  • Economics

  • The basics of the other EA cause areas and general theory? (at least the stuff in ‘Doing Good Better’)

  • Eliezer says to study evolutionary psychology as part of the eleventh virtue (scholarship), but I have not been impressed with evo-psych.

  • Something about mental tech? Maybe mindfulness, Internal Family Systems, or Circling? I am not confident anything in this space fits.