I believe that there are four main strategies groups use to respond to accusations of being a cult: defense, offense, assimilation, and acceptance. This article is representative of the typical rationalist strategy of mounting an intellectual defense. Religious and ethnic groups tend to assimilate or play offense. Nonthreatening but unusual hobbyist, spiritual, or community groups often jokingly pretend to accept, or at least entertain, the accusation.
It seems to me that a defensive strategy is often what genuinely threatening groups use to justify their existence. They may claim that, technically, the group doesn’t fit the characteristics of a cult, or that technically some other group that is an accepted part of society does fit those characteristics, and therefore that society, by its own standards, must accept the group. But this is false! Society doesn’t have to use this standard, and often doesn’t. The move comes across as trying to undermine society’s often unspoken norms, making them legible so the group can manipulate them to achieve its ends. That feels threatening. Multi-level marketing schemes try to frame themselves as technically legitimate businesses. Right-wing extremists try to frame themselves as legitimate, well-organized militias. White nationalists and neo-Nazis try to frame themselves as ordinary ethnic and cultural groups. And rationalists try to frame themselves as ordinary community associations. This pattern-matching, I claim, is part of why rationalists are so vulnerable to cult accusations.
Compounding this is what rationalists do take offense at. Rationalists don’t take offense at being called a cult. They did take offense at having the real name of one of their community leaders revealed. That is very much the sort of thing a secret society or cult might do. If rationalists took loud offense at “cult” accusations while also making moves to show up in wider society as well-assimilated, upstanding members of it, then over time, that would tamp down the cult accusations. But rationalists very much don’t seem to be assimilating. They incorporate seemingly every weird strain of Silicon Valley culture and seem to do almost nothing the “normal way,” from dress to food to housing to relationships to art to conversation to work to politics.
So: no assimilation, no taking offense at being called a cult, no lighthearted acceptance of the accusation, just intellectual defenses arguing that the group doesn’t meet some self-chosen technical criteria for what it means to be a “cult.” That sounds like exactly the kind of thing a cult might do in order to attract more of the kind of people who buy into this way of thinking, and to isolate them from the rest of society through indoctrination into its weird culture!
Having some affinity for rationalism, I think I understand why rationalists do this. They have a history of rejecting (inconsistently) what they call the Dark Arts and “asymmetric weapons,” and of trying to stick to a facts-and-logic approach in every debate. A strategy built on intellectual defense is in line with that. It suits the self-chosen morality of rationalists. But sticking to it so consistently is, again, a refusal to assimilate. So I think that as long as rationalists persist in this strategy, they will continue to be perceived as a cult.
This sounds like damned if you do, damned if you don’t. If someone accuses you of being a cult, saying that you are not is exactly what a cult would do. But it’s not like saying nothing will clear the accusation either.
Actually, saying that you are not a cult is not what an actual cult would do. An actual cult would attack the accuser (like physically, not like “tweet a disagreement”), and redirect the debate to “freedom of religion”.
Is anyone defending the rationality community by some kind of argument that pattern-matches “we are the good/holy guys, therefore criticizing us for anything (whether truthfully or not) is inherently evil”? I am not aware of anything like this. Are people taking personal revenge against David Gerard or Cade Metz (other than complaining online about their behavior)? Again, I am unaware of that.
Less importantly, I don’t believe that “accepting the cult accusation lightheartedly” would work. I am not even sure it ever worked for anyone. Do you have some good examples?
I articulate a fake framework for thinking about strategies by which groups can fight for status as a legitimate culture-in-good-standing. This fight is a long-term struggle, not a guaranteed victory or defeat. Groups are always damned by many, but can aim to be damned by fewer over the long run by being organized and strategic about how they respond to criticism.
Core to this strategy is taking offense at being called a cult while assimilating into the broader culture in ways that do not sacrifice your core values but are crucial issues for others.
For example, to assimilate, Mormons dropped polygamy, repudiated in the strongest terms the violent behavior of people who claim to be Mormons, and now frame themselves as Christians or as the Church of Jesus Christ of Latter-day Saints (LDS) rather than using the more loaded “Mormon.” In fact, the Mormon in my PhD program just refers to “his church” without even mentioning LDS. They also take offense at public media for misrepresenting their faith or their current stance on the problems in the religion’s past, which they acknowledge and condemn. They don’t try to explain why they’re different from a cult according to a list of abstract criteria, the way OP is doing here.
Being lighthearted is not a suitable strategy for a group subjected to serious, sustained cult accusations. It’s a strategy for small, odd, harmless groups to acknowledge their difference from the broader culture, creating space for a person who feels a momentary question or discomfort to bring it up, and then move on. It’s the right move for small new-age meditative communities, the Odd Fellows, etc.
If rationalists wanted to adopt a Mormon-like strategy, they’d need to define the core values of a rationalist. They’d need a formal leadership that can interact with media and pre-emptively sanction subgroups and individuals who give rationalism a bad name. They’d need to make visible demonstrations of assimilation. And they’d need to demand respect and take offense at disrespect, insisting on being viewed as a positive cultural and community association. That’s very far from rationalist trends, and I don’t expect it will happen. But I do think it’s the strategy that every long-lived religion and major cultural group lands on.
Core to this strategy is taking offense at being called a cult while assimilating into the broader culture in ways that do not sacrifice your core values but are crucial issues for others.
I like this summary!
Just a quick idea: I think a nice way to make contact with the outside culture without compromising on anything could be to organize public lectures or workshops. Kinda like the existing workshops for mathematically gifted kids or wannabe bloggers, only these should be short (like, one afternoon), and instead of inviting people to our “walled compound,” they should be held on neutral territory that feels safe (maybe even offer to organize the workshop in a local school). Possible topics: mathematics, statistics, computer science, learning in general, critical thinking, blogging. Too bad I am not in America; I would quite enjoy doing something like that, and I am not doing anything important that this would distract me from. This could help create a public perception of our community as “harmless nerds.”
Ironically, one thing that might help would be to somehow make membership explicit. Without explicit membership, you cannot exclude people (such as Zizians or SBF), so anyone can argue that they belong to us, and there is no way to prove the opposite. Mormons can say “this is not one of us” when someone is not, and they can kick out someone problematic who is. Or maybe “rationality” is too nebulous a word, so we could instead talk about, e.g., “Less Wrong community membership.” It’s like the orthogonality thesis: whether someone is good at Bayesian updating and whether someone is a decent person are two independent things—we should try to find the people in the intersection.
I wonder (but this would be a longer debate) if we could have some kind of “web of trust,” where individual rationalists could specify how much they trust someone to be a nice and reasonable person; the system would calculate a score in some way, and you would need to exceed some threshold to be accepted. If you turn out to be a scammer or a serial killer, everyone who vouched for you would be punished in some way (lose their right to vouch for someone, get a penalty on their own score). No idea how specifically the math should work here.
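As a toy illustration of the vouching scheme above, the math could start very simply. Everything here is a made-up assumption, not a worked-out proposal: the class name, the weighted-sum score, the fixed acceptance threshold, and the halving penalty are all placeholders.

```python
class WebOfTrust:
    """Toy vouching model: members vouch for candidates; a candidate is
    accepted once the summed weight of their vouchers crosses a threshold.
    Vouching for someone who turns out badly costs you future weight."""

    def __init__(self, threshold=2.0):
        self.threshold = threshold
        self.weight = {}    # member -> current vouching weight
        self.vouches = {}   # candidate -> set of members who vouched

    def add_member(self, name):
        self.weight[name] = 1.0  # everyone starts with equal weight

    def vouch(self, voucher, candidate):
        if voucher not in self.weight:
            raise ValueError("only existing members can vouch")
        self.vouches.setdefault(candidate, set()).add(voucher)

    def score(self, candidate):
        # A candidate's score is the summed weight of their vouchers.
        return sum(self.weight[v] for v in self.vouches.get(candidate, ()))

    def is_accepted(self, candidate):
        return self.score(candidate) >= self.threshold

    def penalize_vouchers(self, bad_actor, factor=0.5):
        # Everyone who vouched for a bad actor loses part of their weight.
        for v in self.vouches.get(bad_actor, ()):
            self.weight[v] *= factor
```

Under these assumed numbers, one vouch from a member in good standing is not enough and two are. A real design would also need to handle collusion (sock-puppet members vouching for each other), which this sketch ignores.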
These are good ideas. I like the idea of offering tutoring or classes as a way to engage a broader community. Having formal orgs that interface with media, with official leaders who speak on behalf of their membership, also seems like a good idea. However, to work, I think these orgs are going to have to officially put the brakes on some of the divergent lifestyle choices of membership and on some of the more radical statements by rationalist figures, and it may not be compatible with the culture of rationalists to submit to constraining, assimilative norms in that way.
The web of trust is also something I’ve wanted for the world of science. The way I picture it, you subscribe to other people or organizations whose judgments you trust. Each participant can privately rate their trust level in other participants, and the trust level you observe reflects the aggregate trust levels of the participants you subscribe to. I would love to see such a technology.
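A minimal sketch of that subscription model, under assumptions of my own: the function name, the data layout, and the plain-average aggregation rule are illustrative, not part of any existing system. The trust you observe in a target is just an aggregate of the private ratings given by the sources you subscribe to.

```python
def observed_trust(subscriptions, ratings, viewer, target):
    """Average the private ratings of `target` across everyone
    `viewer` subscribes to; None if no subscribed source rated them."""
    sources = subscriptions.get(viewer, [])
    scores = [ratings[s][target]
              for s in sources
              if target in ratings.get(s, {})]
    return sum(scores) / len(scores) if scores else None


# Hypothetical data: I subscribe to alice and bob, who privately
# rate researcher "x" at 0.9 and 0.5 respectively.
subscriptions = {"me": ["alice", "bob"]}
ratings = {"alice": {"x": 0.9}, "bob": {"x": 0.5}}
```

A real system would presumably weight sources unequally or propagate trust transitively (subscriptions of subscriptions); the flat average is just the simplest starting point.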
They may claim that, technically, the group doesn’t fit the characteristics of a cult, or that technically some other group that is an accepted part of society does fit those characteristics, and therefore that society, by its own standards, must accept the group. But this is false! Society doesn’t have to use this standard, and often doesn’t. The move comes across as trying to undermine society’s often unspoken norms, making them legible so the group can manipulate them to achieve its ends. That feels threatening.
I think this is both an important point and proves too much. I’m interested in a discussion of the harms, cult-related or not, that rationalism might be producing, but I can’t get invested in the opinion of a guy who thinks a three-story building is inherently ominous.
That’s an entirely valid interest. One point I want to make is that when people spin a narrative accusing a group of being a cult, the points of evidence they raise are often post-hoc rationalizations rather than true reports on how they concluded the group was a cult. In this case, I doubt the guy truly finds the building ominous, any more than I find pictures of the art in Jeffrey Epstein’s mansion ominous. They take on an ominous feeling because of prior assumptions about the group that are being mapped onto things associated with it.
This matters because it’s important to accurately distinguish the real evidence someone’s drawing on to form their conclusion that a group is a cult from the narrative they’re spinning to make an accusation or insult. In this case, the quip about the building is the insult, not the evidence. It’s still fine to not take the insult seriously. But usually we evaluate people’s evidence and reasoning process to decide whether to take them seriously. In this case, we’d want to know how the guy truly first came to the conclusion that rationalism was a cult. Maybe he heard a bunch of stories about the Zizians and was exposed to some salacious Metz quotes of Scott’s writing and Tyler Cowen’s quotes of Eliezer. Maybe he heard from his friend they’re a cult and saw some weirdly dressed people going to a party and heard about Aella doing sex work to pay for IVF. Or maybe he applied the cult criteria differently than OP here and decided the rationalists fit the bill. Some reasons may be good, others bad, but in any case, this is where I’d look to decide whether to take him seriously. If I just wanted to defend rationalism I’d say “this guy is a bigot, a rationalist-phobe, and a hypocrite. He should be ashamed of himself. Would he talk this way about a religious community or an ethnic group or a hobby community? No? Then what makes it OK to talk about us this way? It’s unacceptable.” Etc.