People sometimes say that cult members tend to have pre-existing conflicts that lead them to join the cult. Recently I’ve been wondering if this is an underrated aspect of cultishness.
Let’s take the LaSota crew as an example. As I understand it, they were militant vegans.
And if I understand correctly, vegans are concerned about the dynamic where, in order to obtain animal flesh to eat, people usually hire others to raise lots of animals in tiny indoor spaces: sometimes cutting off body parts such as beaks if the animals are too irritable, never letting them out to live freely, taking their children away shortly after birth, and breeding them to grow rapidly even though this often causes genetic disorders that bring a lot of pain and disability. And basically just having them live like that until they get killed and butchered.
And from what I understand, society often tries to obscure this. People get uncomfortable and try to change the subject when you talk about it. People pass laws that make it hard to film and share what’s going on. People come up with convoluted denials that animals have feelings. And so on.
(I am not super certain about the above two paragraphs because I haven’t investigated this myself; I just picked it up by osmosis, and I haven’t seen any serious objections to this narrative.)
I don’t know much of the LaSota crew’s backstory (I think there were other conflicts than this too? I would have to re-read the pages), but basically I wonder if they were vegans, noticed how society (including some rationalists?) conspires to cover these things up, and noticed how Ziz was one of the main people taking it seriously.
You can also sort of take a Bayesian view of it. One way to apply Bayes is that you have a set of “experts” in whom you have varying levels of trust, and as they say things, you use Bayesian updates to reallocate trust from those who make bad predictions to those who make good predictions. If you have a conflict where one side is consistently dishonest, then this procedure can rapidly transfer a ton of trust away from them, to the other side of the conflict.
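The trust-reallocation procedure above can be sketched in a few lines of Python. This is a minimal illustration, not anyone’s actual model: the expert names, probabilities, and the `update_trust` helper are all made up for the example.

```python
def update_trust(trust, predictions, outcome):
    """Multiply each expert's trust by the probability they assigned
    to the observed outcome, then renormalize so trust sums to 1."""
    posterior = {e: w * predictions[e][outcome] for e, w in trust.items()}
    total = sum(posterior.values())
    return {e: w / total for e, w in posterior.items()}

# Two hypothetical "experts": one honest, one consistently dishonest.
trust = {"honest": 0.5, "dishonest": 0.5}

# Each round, the honest expert puts 90% on the true outcome,
# while the dishonest one puts 90% on the false outcome.
for _ in range(5):
    predictions = {
        "honest": {"true": 0.9, "false": 0.1},
        "dishonest": {"true": 0.1, "false": 0.9},
    }
    trust = update_trust(trust, predictions, "true")

print(trust)  # the dishonest expert's share collapses toward zero
```

After just five rounds the honest expert holds essentially all the trust (the odds ratio grows as 9^n), which is the “rapid transfer” dynamic described above.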
This mostly makes sense, but the flawed part is the implicit “homogeneity of the outgroup” assumption. Basically, the cult leader does the trick of dividing the world into two groups: the cult, and everyone else.
The cult tells the truth about that one thing you strongly care about? Check.
All people who lie about the thing you care about are in the “everyone else” group? Check.
The missing part is that… many people in the “everyone else” group also tell the truth about that one thing you strongly care about. That’s because the “everyone else” group literally contains billions of people, with all kinds of opinions and behaviors.
But it is easy to miss this, especially when the cult leader tends to use the liars as the prototypes of the outgroup (essentially “weakmanning” the rest of humanity).
As a specific example, if you strongly care about veganism, you should notice that although the majority of non-Zizians are non-vegans, the majority of vegans are non-Zizians. So you shouldn’t conclude that there is no salvation outside of Zizians.
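To put rough numbers on that base-rate point (all figures here are made up for illustration): even if a tiny group were 100% vegan, almost all vegans would still be outside it.

```python
# Hypothetical numbers illustrating the base-rate point.
group_size = 50               # assumed tiny, fully vegan group
world = 8_000_000_000
vegan_rate_elsewhere = 0.01   # assume ~1% of everyone else is vegan

vegans_outside = (world - group_size) * vegan_rate_elsewhere
p_in_group_given_vegan = group_size / (group_size + vegans_outside)
print(p_in_group_given_vegan)  # well under one in a million
```

So a high P(vegan | in the group) tells you almost nothing about P(in the group | vegan), which is exactly why “there is no salvation outside” doesn’t follow.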
To an extent, yes this is a good solution.
But also, it doesn’t always work. You might have multiple conflicts going on at once, exponentially shrinking the set of people who fit the bill. Most people don’t know who you are, so you might be limited to your local circles. Sometimes the conflict itself is an obscure thing that few people can engage with.
On its own, yes, there are lots of vegans they could have had contact with. But the Zizians were also rationalist/EA types, which heavily restricts the pool. Then again, there are lots of peaceful EA vegans, so this can’t explain it all.
But could there have been other conflicts? I expect there were, though I am not sure about the details. Maybe I am wrong.
That sounds correct. Rat, vegan, trans… maybe one or two more things, and the selection is sufficiently narrow.