Now I feel like rationality itself is an infohazard. Rationality won't hurt you if you are sufficiently sane, but if you start talking about it, insufficiently sane people will listen too, and that can have horrible consequences. (And whenever I try to find a way around this, such as talking openly only to certifiably sane people, it seems like a totally cultish thing to do.)
There is an alternative way, the other extreme: get more and more rationalists. If the communities thus formed do not share the moral inclinations of the LW community, they might develop new coordination structures of their own[1]; and if we don't recruit from the circles of the desperate, those structures will tend to benefit outsiders as well (whereas a community with a large proportion of very dissatisfied people would naturally tend to start a gang or overthrow whatever institutions are around).
(It’s probably worth exploring in a separate post?)
I claim that goals and means are not orthogonal in this case. A community of altruistic people needs structures that involve learning a fair bit about people's values; a group that merely wants tech companies to focus more on consumers' quality of life than they currently do does not.
In my experience, the rationality community in Vienna does not share any of the Bay Area craziness I read about, so yes, it seems plausible that different communities will end up significantly different.
I think there is a strong founder effect: potential new members decide whether to join based on how comfortable they feel among the existing members. And decisions like “we have these rules / we don’t have any rules” or “there are people responsible for organization and safety / everyone needs to take care of themselves”, once established, easily become “the way things are done here”.
But you are also limited by the pool you are recruiting potential new members from. It could be that there are simply not enough people to form a local rationality community. Or the local memes may be so strong (e.g. a positive attitude towards drug use, or wokeness) that in practice you cannot push against them without actively rejecting most would-be members, which would be a weird dynamic. (You already need to push back strongly against people who simply do not get what rationality means but try to join anyway.)