An insight that I’d kind of already had, but which this interview with Michael Taft (relevant section starts at about 32 minutes) helped crystallize:
We tend to think of a “cult leader” as someone who intentionally sets out to create a cult. But most cult-like things probably don’t form like that. A lot of people feel a strong innate desire to be in a cult.
In the podcast, Taft suggests that it’s rooted in an infant’s need to attach to a caregiver, and to treat them as a fully dependable authority to fix all problems – a desire which doesn’t necessarily ever go fully away. Once someone becomes a teacher of some sort, even if they had absolutely no desire to create a cult, they will regardless attract people who want to be their cultists.
There are people who want to find a fully dependable authority figure to look up to, and are just looking for someone who feels like a good fit for the role. (I should note that I have definitely not been immune to feeling this yearning myself.) To avoid having cultists, “not intending to create a cult” isn’t enough; you have to actively fight against people’s tendency to idealize you, by doing things that force them to confront the fact that you are actually just a human.
I’m reminded of something I recall Eliezer Yudkowsky once saying: “if you tell your doting followers not to form a cult, they will go around saying ‘We Must Not Form A Cult, Great Leader Mundo Said So’.”
Once people do start pulling you towards a cult leader role, it’s going to feel very appealing. What it feels like from the inside is “all of these people like me and say that I’ve done a lot of good for them, so clearly I must be doing things right, and since they also listen to me, I can use my position to help them out even more”.
It’s not just that the cultists are getting “brainwashed” by their leader; it’s also that the leader is getting brainwashed by their cultists to take the role that they want the leader to take. Cults are said to use “love bombing” to attract new recruits, but in at least some cases, it probably also happens that the cult leader is getting love bombed by their followers.
And the temptation to take on that role is powerful not only because it feels nice personally, but also because it does allow you to use your power for good. One definition for a hypnotic trance that I’ve heard is that it’s a state in which a person’s critical faculty is bypassed, which allows the hypnotist to directly make changes in the mind of the person being hypnotized. And you can do a lot of good that way, such as by implanting suggestions that help people overcome their addictions or phobias.
Being someone’s cultist (in this sense) is kind of like them having you in a hypnotic trance. It is possible for them to use that power in a way that’s beneficial, because the critical faculty that might normally reject or modulate the leader’s suggestions gets partially bypassed.
But that same power makes it extremely dangerous, since people are not going to think critically about what you say, and may take your words far more literally than you intended, when you didn’t think of adding the obvious-to-you caveats about how it shouldn’t be interpreted.
I’ve been feeling this myself. I’ve written various things that people like. And I’ve been having a definite sense of some of my social environment trying to tug me more towards a role as a teacher and as an authority, getting the sense that some people are idealizing me. (And again, yes, there have been several times when I’ve had the cult follower energy myself, too – both towards online writers and in some of my romantic relationships.)
I’m reminded here again of Valentine’s essay on the “Intelligent Social Web” and of how people tend to take the kinds of roles that their social environment recognizes and rewards… and how people try to tug others into the kinds of roles that they can recognize and know how to interact with, and the collective power of everyone doing this causes the social web as a whole to try to pull people into recognizable roles – including the role of “charismatic leader”.
Here we come back to Taft’s suggestion that many people have an instinctive desire to maneuver someone into a role they recognize as that of a “trustworthy caretaker”, because the complementary “child” role feels very easy to play – just surrender your judgment to the other person and do everything the way (you think that) they want you to.
And I’m also reminded of siderea’s analysis of kingship in Watership Down, and of how Hazel never originally thought of himself as a leader in the novel, until the characters around him started treating him as one – and how that might not be as good a deal as our society makes “kingship” sound:
If you demonstrate a concern for the wellbeing of the people in your group, they will start seeing their wellbeing as your concern. Start taking responsibility for how things go in a group, and people will start seeing you as responsible for how things go in a group.
This, right here, is what causes many people to back away from Kingship. Which is their right, of course. It’s totally legitimate to look at that deal and say, “Oh, hell no.”
Our society tells us that being King is awesome and everyone – well, everyone normal – wants to be one. “Everybody wants to rule the world.” No, actually, they don’t. My experience tells me that most people are very reluctant to step into the job of King, and this consequence of the role is a primary reason why.
I don’t know, but it strikes me as at least plausible that prospective leaders themselves getting partially deluded about what it is that they are signing up for is what enables them to actually step into the role rather than just saying “oh hell no”.
I’ve referenced this post, or at least this concept, in the past year. I think it’s fairly important. I’ve definitely seen this dynamic: I’ve felt it as a participant who totally wants a responsible authority figure to look up to and follow, and I’ve seen it in how people respond to various teacher-figures in the rationalsphere.
I think the rationalsphere lucked out in its founding members being pretty wise and going out of their way to try to ameliorate a lot of the effects here, and still those people end up getting treated in a weird cult-leader-y way even when they’re trying not to. (I recall one community leader telling me, 8 years ago, “look, I don’t know anything, please don’t overupdate on what I say” – and somehow that made them seem even more wise, and made me treat them like Yoda even more.)
My thoughts on it are somewhat connected to “In Defense of Attempting Hard Things” and discussion surrounding Leverage Research (which was triggered by a particular writeup by Zoe, but I think was more of a broader set of pent up frustrations). The rationalsphere/longtermist/EAcosystem are trying to do pretty hard things. Pretty hard things often require both commitment/dedication, and willingness to try weird strategies. This combination tends to produce cults or cult-adjacent things as a byproduct, which is worrisome and bad, but, man, it’s still important to actually try the hard things.
The “hard things / weirdness” → “cultishness” model is separate from the model in this post, but the fact that (I think) Hard Weird Communities are important makes the failure modes of the OP more costly.
This year I ran into a person who seemed to be accidentally attracting a cult around them, despite seeming really innocuous in a lot of ways. I don’t think I directly referred them to this post, but having the concept handy when talking to them was useful.