Very well said. I also think more is possible—not nearly as much more as I originally thought, but there is always room for improvement, and I do think there’s a real possibility that community effects can be huge. I mean, individual humans are smarter than individual animals, but the real advantages have accrued through society, specialization, teamwork, passing on knowledge, and sharing technology—all communal activities.
And yeah, probably the main barrier boils down to the things you mentioned. People who are interested in self-improvement and truth are a small subset of the population[1]. Across the country/world there are lots of them, but humans have some psychological thing about meeting face to face, and the local density most places is below critical mass. And having people move to be closer together would be a big ask even if they were already great friends, which the physical distance makes difficult. As far as I can see, the possible options are:
1. Move into proximity (very costly)
2. Start a community with the very few nearby rationalists (difficult to keep any momentum)
3. Start a community with nearby non-rationalists (could be socially rewarding, but likely to dampen any rationality advantage)
4. Teach people nearby to be rational (ideal, but very difficult)
5. Build an online community (LW is doing this. Could try video meetings, but I predict it would still feel less connected than in person and make momentum difficult)
5b. Try to change your psychology so that online feels like in person. (Also, difficult)
6. Do it without a community (The default, but limited effectiveness)
So, I don’t know—maybe when AR gets really good we could all hang out in the “metaverse” and it will feel like hanging out in person. Maybe even then it won’t—maybe it’s just literally having so many other options that makes the internet feel impersonal. If so, weird idea—have LW assign splinter groups and that’s the only group you get (maybe you can move groups, but there’s some waiting period so you can’t ‘hop’). And of course, maybe there just isn’t a better option than what we’re already doing.
Personally—I’m trying to start regularly calling my two brothers. They don’t formally study rationality, but they care about it and are pretty smart. The family connection kinda makes up for the long distance and small group size, but it’s still not easy to get it going. I’d like to build a close-knit group of friends where I live, though they probably won’t be rationalists. But I’ll probably need to stop doing prediction markets to have the time to invest in that.
Oh, and what you said about the 5 stages makes a lot of sense—my timing is probably just not lined up with others, and maybe in a few years someone else will ask this and I’ll feel like “well I’m not surprised by what rationalists are accomplishing—I updated my model years ago”.
I read Scott Alexander say that peddling ‘woo’ might just be a side effect of a group taking self-improvement seriously while lacking the ability to fund actual studies, and I think that hypothesis makes sense.
My hypothesis for the relation between self-improvement and woo is that people suck at holding two perspectives that seem to point in opposite directions long enough to figure out a synthesis.
Let me give you an example: historically, people dreamed about flying. There are two simple ways to respond to this desire:
- Giving up. Gravity is a law of nature; birds have wings, humans don’t. End of conversation. Everyone who cannot suppress their desire to fly is an idiot, so let’s laugh at them!
- Wishful thinking. I am sure that if I pray hard enough and purge my mind of negative thoughts, I will be able to spread my arms and fly. La-la-la, I can’t hear your skepticism!
The correct solution, as we know now, is to accept gravity as a fact, and then explore the other laws of nature until we find a force that can overcome it. There are even multiple solutions—balloons, gliders, jet engines—but all of them require doing something complicated.
The difficulty is not that gravity is fundamentally incompatible with flying, but that the two require contradictory emotions. You can feel the inescapable pressure of the universal law of gravity… or you can feel lightness and imagine flying… but it is difficult to feel both at the same time. Human thinking is just a thin layer on top of a fundamentally emotional machine; people usually get addicted to one emotion or the other, and then become unable to consider the other part of the picture.
Similar pattern: effective altruism. People feel sad about bad things happening in the world and our inability to address them efficiently. The simple solutions:
- Grow up and accept the “wisdom” that the world cannot change. This can be simple fatalism, or a clever economic theory about how feeding Africans only makes them reproduce more.
- Pray harder, post touching pictures on social networks, meditate, and send positive energy.
The correct solution: collect data, calculate, and promote the actions with the greatest impact.
The emotional problem: “observing reality and calculating the hard data” and “desiring to change reality” are emotionally incompatible. People choose one emotion or the other, and get stuck with it.
And self-improvement seems to follow a similar dichotomy:
- Accept that you can’t improve (skills, looks, money, relationships), become proud of this “wisdom”, laugh at people who try to achieve something and call them immature, make sure to collect data about all their failures, and never mention any successes.
- Read your horoscope, practice positive thinking, read alternative news sources, harmonize your chakras, read inspiring success stories, join a pyramid scheme, sell homeopathy, be open to everything (except the things you develop an aversion to, such as computers or vaccines).
Again, competing emotions of “noticing that the world sucks” and “the feeling that more is possible”. Can you keep trying when you know that the motivational literature is a scam, the success stories are mostly lies, many scientific findings don’t replicate, and your own results are probably just placebo?
About your list of options:
If you want to move somewhere, it would be nice (if you have enough time and money) to check out the rationalist communities outside the Bay Area, because that place just seems doomed—the social pressure to take drugs and meditate will be too strong, and even if you personally resist it, the vortex will keep pulling away parts of your social circle.
Maybe Scott Alexander knows more about this: a few years ago, he traveled across the world, visiting various ACX meetups. He could at least narrow down the list of interesting places.
Recruiting new rationalists seemed like the best option to me a few years ago. I mean, if I was impressed by the Sequences, surely there must be other people who would feel the same way, if only they got the text. Maybe I should go to a local math college and give away a few free copies of HP:MoR! These days, I don’t believe it anymore. The internet is a small place, and the rationalist community has a decent online presence. Most nerds have already heard about us. If they don’t join, it’s probably because they are not interested. There may be an exception or two, but probably not enough to start a local meetup.
If you start a community with non-rationalists, the chance of turning it into a community of rationalists seems zero to me. (One possible exception: you could start a community that teaches self-help, or something like that, and then gradually find potential rationalists among the students.)
“Raising the sanity waterline” was the old plan, and some CFAR materials are freely available online. But you probably shouldn’t do it alone, as it is a lot of work.
I think you could achieve a better “tribe” feeling online with a smaller dedicated group, meeting regularly, having video calls, and a private channel for asynchronous communication. (Or maybe try The Guild of the Rose? I don’t know much about them, though.)
Regularly calling two or three people still sounds preferable to doing it alone. Maybe you could try to meet more people, and hope to find someone rationality-compatible.