Cultish Countercultishness

In the modern world, joining a cult is probably one of the worse things that can happen to you. The best-case scenario is that you’ll end up in a group of sincere but deluded people, making an honest mistake but otherwise well-behaved, and you’ll spend a lot of time and money but end up with nothing to show. Actually, that could describe any failed Silicon Valley startup. Which is supposed to be a hell of a harrowing experience, come to think. So yes, very scary.

Real cults are vastly worse. “Love bombing” as a recruitment technique, targeted at people going through a personal crisis. Sleep deprivation. Induced fatigue from hard labor. Distant communes to isolate the recruit from friends and family. Daily meetings to confess impure thoughts. It’s not unusual for cults to take all the recruit’s money—life savings plus weekly paycheck—forcing them to depend on the cult for food and clothing. Starvation as a punishment for disobedience. Serious brainwashing and serious harm.

With all that taken into account, I should probably sympathize more with people who, embarking on some odd-seeming endeavor, are terribly nervous that they might be joining a cult. It should not grate on my nerves. Which it does.

Point one: “Cults” and “non-cults” aren’t separated natural kinds like dogs and cats. If you look at any list of cult characteristics, you’ll see items that could easily describe political parties and corporations—“group members encouraged to distrust outside criticism as having hidden motives,” “hierarchical authoritative structure.” I’ve written on group failure modes like group polarization, happy death spirals, uncriticality, and evaporative cooling, all of which seem to feed on each other. When these failures swirl together and meet, they combine to form a Super-Failure stupider than any of the parts, like Voltron. But this is not a cult essence; it is a cult attractor.

Dogs are born with dog DNA, and cats are born with cat DNA. In the current world, there is no in-between. (Even with genetic manipulation, it wouldn’t be as simple as creating an organism with half dog genes and half cat genes.) It’s not like there’s a mutually reinforcing set of dog-characteristics, which an individual cat can wander halfway into and become a semidog.

The human mind, as it thinks about categories, seems to prefer essences to attractors. The one wishes to say, “It is a cult,” or, “It is not a cult,” and then the task of classification is over and done. If you observe that Socrates has ten fingers, wears clothes, and speaks fluent Greek, then you can say, “Socrates is human,” and from there deduce, “Socrates is vulnerable to hemlock,” without doing specific blood tests to confirm his mortality. You have decided Socrates’s humanness once and for all.

But if you observe that a certain group of people seems to exhibit ingroup-outgroup polarization and see a positive halo effect around their Favorite Thing Ever—which could be Objectivism, or vegetarianism, or neural networks—you cannot, from the evidence gathered so far, deduce whether they have achieved uncriticality. You cannot deduce whether their main idea is true, or false, or genuinely useful but not quite as useful as they think. From the information gathered so far, you cannot deduce whether they are otherwise polite, or if they will lure you into isolation and deprive you of sleep and food. The characteristics of cultness are not all present or all absent.

If you look at online arguments over “X is a cult,” “X is not a cult,” then one side goes through an online list of cult characteristics and finds one that applies and says, “Therefore it is a cult!” And the defender finds a characteristic that does not apply and says, “Therefore it is not a cult!”

You cannot build up an accurate picture of a group’s reasoning dynamic using this kind of essentialism. You’ve got to pay attention to individual characteristics individually.

Furthermore, reversed stupidity is not intelligence. If you’re interested in the central idea, not just the implementation group, then smart ideas can have stupid followers. Lots of New Agers talk about “quantum physics,” but this is no strike against quantum physics.1 Along with binary essentialism goes the idea that if you infer that a group is a “cult,” therefore their beliefs must be false, because false beliefs are characteristic of cults, just like cats have fur. If you’re interested in the idea, then look at the idea, not the people. Cultishness is a characteristic of groups more than hypotheses.

The second error is that when people nervously ask, “This isn’t a cult, is it?” it sounds to me like they’re seeking reassurance of rationality. The notion of a rationalist not getting too attached to their self-image as a rationalist deserves its own essay.2 But even without going into detail, surely one can see that nervously seeking reassurance is not the best frame of mind in which to evaluate questions of rationality. You will not be genuinely curious or think of ways to fulfill your doubts. Instead, you’ll find some online source which says that cults use sleep deprivation to control people, you’ll notice that Your-Favorite-Group doesn’t use sleep deprivation, and you’ll conclude, “It’s not a cult. Whew!” If it doesn’t have fur, it must not be a cat. Very reassuring.

But every cause wants to be a cult, whether the cause itself is wise or foolish. The ingroup-outgroup dichotomy, etc., are part of human nature, not a special curse of mutants. Rationality is the exception, not the rule. You have to put forth a constant effort to maintain rationality against the natural slide into entropy. If you decide, “It’s not a cult!” and sigh with relief, then you will not put forth a continuing effort to push back ordinary tendencies toward cultishness. You’ll decide the cult-essence is absent, and stop pumping against the entropy of the cult attractor.

If you are terribly nervous about cultishness, then you will want to deny any hint of any characteristic that resembles a cult. But any group with a goal seen in a positive light is at risk for the halo effect, and will have to pump against entropy to avoid an affective death spiral. This is true even for ordinary institutions like political parties—people who think that “liberal values” or “conservative values” can cure cancer, etc. It is true for Silicon Valley startups, both failed and successful. It is true of Mac users and of Linux users. The halo effect doesn’t become okay just because everyone does it; if everyone walks off a cliff, you wouldn’t too. The error in reasoning is to be fought, not tolerated. But if you’re too nervous about, “Are you sure this isn’t a cult?” then you will be reluctant to see any sign of cultishness, because that would imply you’re in a cult, and It’s not a cult!! So you won’t see the current battlefields where the ordinary tendencies toward cultishness are creeping forward, or being pushed back.

The third mistake in nervously asking, “This isn’t a cult, is it?” is that, I strongly suspect, the nervousness is there for entirely the wrong reasons.

Why is it that groups which praise their Happy Thing to the stars, encourage members to donate all their money and work in voluntary servitude, and run private compounds in which members are kept tightly secluded, are called “religions” rather than “cults” once they’ve been around for a few hundred years?

Why is it that most of the people who nervously ask of cryonics, “This isn’t a cult, is it?” would not be equally nervous about attending a Republican or Democratic political rally? Ingroup-outgroup dichotomies and happy death spirals can happen in political discussion, in mainstream religions, in sports fandom. If the nervousness came from fear of rationality errors, people would ask, “This isn’t an ingroup-outgroup dichotomy, is it?” about Democratic or Republican political rallies, in just the same fearful tones.

There’s a legitimate reason to be less fearful of Libertarianism than of a flying-saucer cult, because Libertarians don’t have a reputation for employing sleep deprivation to convert people. But cryonicists don’t have a reputation for using sleep deprivation, either. So why be any more worried about having your head frozen after you stop breathing?

I suspect that the nervousness is not the fear of believing falsely, or the fear of physical harm. It is the fear of lonely dissent. The nervous feeling that subjects get in Asch’s conformity experiment, when all the other subjects (actually confederates) say one after another that line C is the same size as line X, and it looks to the subject like line B is the same size as line X. The fear of leaving the pack.

That’s why groups whose beliefs have been around long enough to seem “normal” don’t inspire the same nervousness as “cults,” though some mainstream religions may also take all your money and send you to a monastery. It’s why groups like political parties, which are strongly liable to rationality errors, don’t inspire the same nervousness as “cults.” The word “cult” isn’t being used to symbolize rationality errors; it’s being used as a label for something that seems weird.

Not every change is an improvement, but every improvement is necessarily a change. That which you want to do better, you have no choice but to do differently. Common wisdom does embody a fair amount of, well, actual wisdom; yes, it makes sense to require an extra burden of proof for weirdness. But the nervousness isn’t that kind of deliberate, rational consideration. It’s the fear of believing something that will make your friends look at you really oddly. And so people ask, “This isn’t a cult, is it?” in a tone that they would never use for attending a political rally, or for putting up a gigantic Christmas display.

That’s the part that bugs me.

It’s as if, as soon as you believe anything that your ancestors did not believe, the Cult Fairy comes down from the sky and infuses you with the Essence of Cultness, and the next thing you know, you’re all wearing robes and chanting. As if “weird” beliefs are the direct cause of the problems, never mind the sleep deprivation and beatings. The harm done by cults—the Heaven’s Gate suicide and so on—just goes to show that everyone with an odd belief is crazy; the first and foremost characteristic of “cult members” is that they are Outsiders with Peculiar Ways.

Yes, socially unusual belief puts a group at risk for ingroup-outgroup thinking and evaporative cooling and other problems. But the unusualness is a risk factor, not a disease in itself. Same thing with having a goal that you think is worth accomplishing. Whether or not the belief is true, having a nice goal always puts you at risk of the happy death spiral. But that makes lofty goals a risk factor, not a disease. Some goals are genuinely worth pursuing.3

Problem four: The fear of lonely dissent is something that cults themselves exploit. Being afraid of your friends looking at you disapprovingly is exactly the effect that real cults use to convert and keep members—surrounding converts with wall-to-wall agreement among cult believers.

The fear of strange ideas, the impulse to conformity, has no doubt warned many potential victims away from flying saucer cults. When you’re out, it keeps you out. But when you’re in, it keeps you in. Conformity just glues you to wherever you are, whether that’s a good place or a bad place.

The one wishes there was some way they could be sure that they weren’t in a “cult.” Some definite, crushing rejoinder to people who looked at them funny. Some way they could know once and for all that they were doing the right thing, without these constant doubts. I believe that’s called “need for closure.” And—of course—cults exploit that, too.

Hence the phrase “cultish countercultishness.”

Living with doubt is not a virtue—the purpose of every doubt is to annihilate itself in success or failure, and a doubt that just hangs around accomplishes nothing. But sometimes a doubt does take a while to annihilate itself. Living with a stack of currently unresolved doubts is an unavoidable fact of life for rationalists. Doubt shouldn’t be scary. Otherwise you’re going to have to choose between living one heck of a hunted life, or one heck of a stupid one.

If you really, genuinely can’t figure out whether a group is a “cult,” then you’ll just have to choose under conditions of uncertainty. That’s what decision theory is all about.

Problem five: Lack of strategic thinking.

I know people who are cautious around ideas like intelligence explosion and superintelligent AI, and they’re also cautious around political parties and mainstream religions. Cautious, not nervous or defensive. These people can see at a glance that singularity-ish ideas aren’t currently the nucleus of a full-blown cult with sleep deprivation, etc. But they worry that it will become a cult, because of risk factors like turning the concept of a powerful AI into a Super Happy Agent (an agent defined primarily by agreeing with any nice thing said about it). Just because something isn’t a cult now doesn’t mean it won’t become a cult in the future. Cultishness is an attractor, not an essence.

Does this kind of caution annoy me? Hell no. I spend a lot of time worrying about that scenario myself. I try to place my Go stones in advance to block movement in that direction.4

People who talk about “rationality” also have an added risk factor. Giving people advice about how to think is an inherently dangerous business. But it is a risk factor, not a disease.

Both of my favorite Causes are at risk for cultishness. Yet somehow I get asked, “Are you sure this isn’t a cult?” a lot more often when I talk about powerful AIs than when I talk about probability theory and cognitive science. I don’t know if one risk factor is higher than the other, but I know which one sounds weirder . . .

Problem #6 with asking, “This isn’t a cult, is it?” . . .

Just the question itself places me in a very annoying sort of Catch-22. An actual Evil Guru would surely use the one’s nervousness against them, and design a plausible elaborate argument explaining Why This Is Not A Cult, and the one would be eager to accept it. Sometimes I get the impression that this is what people want me to do! Whenever I try to write about cultishness and how to avoid it, I keep feeling like I’m giving in to that flawed desire—that I am, in the end, providing people with reassurance. Even when I tell people that a constant fight against entropy is required.

It feels like I’m making myself a first dissenter in Asch’s conformity experiment, telling people, “Yes, line X really is the same as line B, it’s okay for you to say so too.” They shouldn’t need to ask! Or, even worse, it feels like I’m presenting an elaborate argument for Why This Is Not A Cult. It’s a wrong question.

Just look at the group’s reasoning processes for yourself, and decide for yourself whether it’s something you want to be part of, once you get rid of the fear of weirdness. It is your own responsibility to stop yourself from thinking cultishly, no matter which group you currently happen to be operating in.

Cults feed on groupthink, nervousness, desire for reassurance. You cannot make nervousness go away by wishing, and false self-confidence is even worse. But so long as someone needs reassurance—even reassurance about being a rationalist—that will always be a flaw in their armor. A skillful swordsman focuses on the target, rather than glancing away to see if anyone might be laughing. When you know what you’re trying to do and why, you’ll know whether you’re getting it done or not, and whether a group is helping you or hindering you.5

1Of course, stupid ideas can also have stupid followers.

2Though see the two cult koans, “Why Truth?” (in Map and Territory), and “The Twelve Virtues of Rationality” (http://www.lesswrong.com/rationality/the-twelve-virtues-of-rationality).

3On the other hand, I see no legitimate reason for sleep deprivation or threatening dissenters with beating, full stop. When a group does this, then whether you call it “cult” or “not-cult,” you have directly answered the pragmatic question of whether to join.

4Hence, for example, the series of essays on cultish failures of reasoning.

5PS: If the one comes to you and says, “Are you sure this isn’t a cult?” don’t try to explain all these concepts in one breath. You’re underestimating inferential distances. The one will say, “Aha, so you’re admitting you’re a cult!” or, “Wait, you’re saying I shouldn’t worry about joining cults?” or, “So . . . the fear of cults is cultish? That sounds awfully cultish to me.”

So the last annoyance factor—#7 if you’re keeping count—is that all of this is such a long story to explain.