Every Cause Wants To Be A Cult

Cade Metz at The Register recently alleged that a secret mailing list of Wikipedia’s top administrators has become obsessed with banning all critics and possible critics of Wikipedia. Including banning a productive user when one administrator—solely because of the productivity—became convinced that the user was a spy sent by Wikipedia Review. And that the top people at Wikipedia closed ranks to defend their own. (I have not investigated these allegations myself, as yet. Hat tip to Eugen Leitl.)

Is there some deep moral flaw in seeking to systematize the world’s knowledge, which would lead pursuers of that Cause into madness? Perhaps only people with innately totalitarian tendencies would try to become the world’s authority on everything—

Correspondence bias alert! (Correspondence bias: making inferences about someone’s unique disposition from behavior that can be entirely explained by the situation in which it occurs. When we see someone else kick a vending machine, we think they are “an angry person”, but when we kick the vending machine, it’s because the bus was late, the train was early, and the machine ate our money.) If the allegations about Wikipedia are true, they’re explained by ordinary human nature, not by extraordinary human nature.

The ingroup-outgroup dichotomy is part of ordinary human nature. So are happy death spirals and spirals of hate. A Noble Cause doesn’t need a deep hidden flaw for its adherents to form a cultish in-group. It is sufficient that the adherents be human. Everything else follows naturally, decay by default, like food spoiling in a refrigerator after the electricity goes off.

In the same sense that every thermal differential wants to equalize itself, and every computer program wants to become a collection of ad-hoc patches, every Cause wants to be a cult. It’s a high-entropy state into which the system trends, an attractor in human psychology. It may have nothing to do with whether the Cause is truly Noble. You might think that a Good Cause would rub off its goodness on every aspect of the people associated with it—that the Cause’s followers would also be less susceptible to status games, ingroup-outgroup bias, affective spirals, leader-gods. But believing one true idea won’t switch off the halo effect. A noble cause won’t make its adherents something other than human. There are plenty of bad ideas that can do plenty of damage—but that’s not necessarily what’s going on.

Every group of people with an unusual goal—good, bad, or silly—will trend toward the cult attractor unless they make a constant effort to resist it. You can keep your house cooler than the outdoors, but you have to run the air conditioner constantly, and as soon as you turn off the electricity—give up the fight against entropy—things will go back to “normal”.

On one notable occasion there was a group that went semicultish whose rallying cry was “Rationality! Reason! Objective reality!” (More on this in future posts.) Labeling the Great Idea “rationality” won’t protect you any more than putting up a sign over your house that says “Cold!” You still have to run the air conditioner—expend the required energy per unit time to reverse the natural slide into cultishness. Worshipping rationality won’t make you sane any more than worshipping gravity enables you to fly. You can’t talk to thermodynamics and you can’t pray to probability theory. You can use it, but not join it as an in-group.

Cultishness is quantitative, not qualitative. The question is not “Cultish, yes or no?” but “How much cultishness and where?” Even in Science, which is the archetypal Genuinely Truly Noble Cause, we can readily point to the current frontiers of the war against cult-entropy, where the current battle line creeps forward and back. Are journals more likely to accept articles with a well-known authorial byline, or from an unknown author from a well-known institution, compared to an unknown author from an unknown institution? How much belief is due to authority and how much is from the experiment? Which journals are using blinded reviewers, and how effective is blinded reviewing?

I cite this example, rather than the standard vague accusations of “Scientists aren’t open to new ideas”, because it shows a battle line—a place where human psychology is being actively driven back, where accumulated cult-entropy is being pumped out. (Of course this requires emitting some waste heat.)

This post is not a catalog of techniques for actively pumping against cultishness. Some such techniques I have said before, and some I will say later. Today I just want to point out that the worthiness of the Cause does not mean you can spend any less effort in resisting the cult attractor. And that if you can point to current battle lines, it does not mean you confess your Noble Cause unworthy. You might think that if the question were “Cultish, yes or no?” that you were obliged to answer “No”, or else betray your beloved Cause. But that is like thinking that you should divide engines into “perfectly efficient” and “inefficient”, instead of measuring waste.

Contrariwise, if you believe that it was the Inherent Impurity of those Foolish Other Causes that made them go wrong, if you laugh at the folly of “cult victims”, if you think that cults are led and populated by mutants, then you will not expend the necessary effort to pump against entropy—to resist being human.