First, I want to be clear that our goal, as I described earlier, is not necessarily to get people to Less Wrong. There are dangers of Endless September if we do that. Our primary goal is to spread rationality ideas to a broad audience. Doing so does not necessarily involve overcoming the “great barrier”; instead, it involves couching rationality in the language of science-based self-improvement, as I do in this article, which has been shared over 1K times.
This gets at the broader point: I think rationality is a spectrum, in line with Keith Stanovich’s research. Our aim, then, is to raise the rationality IQ of the population. The metaphor of a “great barrier” is thus not in line with the actual research on rationality and how it functions.
Now, to the question of Less Wrong. What we aim to do is gradually move people up the ladder of complexity, and eventually have some of those who choose to climb it engage with Less Wrong. We don’t assume that all, or even most, or even 10% will do so, but some will. In fact, some have already started engaging with Less Wrong, reading the Sequences, and so on. This happens only after they have received adequate training to help them cross the inferential gap.
Far from all people are interested in this level of high-brow engagement, and that’s okay! As long as we raise the rationality IQ of the population, the sanity waterline, we’re doing what we set out to do.