Everyone, read also the comments at the EA website; the top one makes a great point:
while EA/rationalism is not a cult, it contains enough ingredients of a cult that it’s relatively easy for someone to go off and make their own.
To avoid derailing the debate towards the definition of cult etc., let me paraphrase it as:
EA/rationalism is not an evil project, but it is relatively easy for someone to start an evil project by recruiting within the EA/rationalist ecosystem. (As opposed to starting an evil project somewhere else.)
This is how.
EA/rationalism in general [...] lacks enforced conformity and control by a leader. [...]
However, what seems to have happened is that multiple people have taken these base ingredients and just added in the conformity and charismatic leader parts. You put these ingredients in a small company or a group house, put an unethical or mentally unwell leader in charge, and you have everything you need for an abusive [...] environment. [...] This seems to have happened multiple times already.
I didn’t spend much time thinking about it, but I suspect that the lack of “conformity and control” in EA/rationalism may actually be a weakness, from this perspective. Whenever a bad actor starts doing something “in the name of EA/rationality”, there is no established mechanism to make it public knowledge that “no, you are not; this is your private project”. Especially when the bad actor does it at a LW meetup, or makes it publicly known that they donate money… then it would seem quite silly to argue that they are not “an EA/rationality project”.
(Compare it to other communities, which have more control and conformity, such as religion, where members clearly understand the difference between “this is just a personal opinion of someone who happens to be a Catholic” and “this is the official teaching of the Catholic church”.)
I’d suggest that what’s needed is a strong immune system against conformity and control which does not itself require conformity or control: a generalized resistance to agentic domination by another being, of any kind, and, as part of it, a cultural habit of joining up with others who are skilled at resisting pressure whenever pressure is attempted, without that joining up creating the very problems it attempts to prevent. I’m a fan of this essay on the topic, as well as other writing from that group.
I don’t think the lack of leadership is in itself the only issue here. I think the general framework is vulnerable to cultishness for a few more reasons. The first is that it encourages dallying with unconventional ideas, and eschewing the notion that if something sounds crazy to the average person, it must be wrong. In fact, there’s plenty of “everyone else is the actually crazy one” thinking, which may be necessary insofar as you want to try becoming more rational, but it also means you now have fewer checks on your overall beliefs and behaviour. The other, related reason is the focus on utilitarianism as a philosophy, which creates even stronger conditions for specific beliefs like “doing crazy thing X is actually good as long as it’s justified by projected good outcomes”.
You mention religion, but Catholicism is pretty much the only one with such clear leadership. That helps, but it doesn’t make Catholicism the only stable religion. IMO, with a strong leadership, rationalism as a whole would just be more likely to become a cult.
I agree it is not the only issue. I think it is a combination of ideas being genuinely dangerous, and no one having the authority to declare: “you are using those ideas wrong”.
Plus there is the general “contrarian” attitude, where edgier opinions automatically give you higher status. So if someone hypothetically volunteered for the role of calling out wrong implementations of the dangerous ideas, they would probably be perceived as not smart/brave enough to appreciate them.
*
Thinking more about the analogies with religion… when people repeatedly propose something that is (in their perspective) an incorrect application of their ideas, they give it a name, declare it a heresy, and that makes it easier to deflect the problem in future by simply labeling it.
So perhaps the rationalist/EA alternative could be to maintain an online list of “ideas that are frequently attributed to, or associated with, rationalists / effective altruists, but we explicitly disapprove of them; here is a short summary why”. Probably with a shorter title, maybe “frequent bad ideas”. The next step would be to repeatedly tell new members about this list.
This has the potential to backfire by making those bad ideas more visible. So we would be trading the certainty of exposure against a probability of explosion. On the conservative side, we could simply describe the things that have already happened (the ideas that were already followed, and it didn’t end well).
Again, you seem to be focused on Catholicism. Catholicism is not the norm. Orthodox Christianity is maybe also kind of like that, but Protestant denominations are not, Islam is not, most of Buddhism is not, Hinduism is not, and don’t even get me started on Shinto and other forms of animism. Most religions don’t have a fixed canon; they have at most a community which might decide to strongly shun (or even straight up violently punish) anyone whose beliefs are so aberrant they might as well not belong to the same religion any more. But the boundary itself isn’t well defined. If you want to draw inspiration from religions (not that they are always the best example; Islam has given birth to plenty of violent and radical offshoots, for instance, and exactly for the reason you bring up, no one is quite in a position to call them out as wrong with unique authority), then you have to look at other things: at how they achieve collective cohesion even without central leadership, because central leadership is pretty much just the specific solution the Catholics came up with.