Evaporative Cooling of Group Beliefs
Early studiers of cults were surprised to discover that when cults receive a major shock—a prophecy fails to come true, a moral flaw of the founder is revealed—they often come back stronger than before, with increased belief and fanaticism. The Jehovah’s Witnesses placed Armageddon in 1975, based on Biblical calculations; 1975 has come and gone. The Unarian cult, still going strong today, survived the nonappearance of an intergalactic spacefleet on September 27, 1975.
Why would a group belief become stronger after encountering crushing counterevidence?
The conventional interpretation of this phenomenon is based on cognitive dissonance. When people have taken “irrevocable” actions in the service of a belief—given away all their property in anticipation of the saucers landing—they cannot possibly admit they were mistaken. The challenge to their belief presents an immense cognitive dissonance; they must find reinforcing thoughts to counter the shock, and so become more fanatical. In this interpretation, the increased group fanaticism is the result of increased individual fanaticism.
I was looking at a Java applet which demonstrates the use of evaporative cooling to form a Bose-Einstein condensate, when it occurred to me that another force entirely might operate to increase fanaticism. Evaporative cooling sets up a potential energy barrier around a collection of hot atoms. Thermal energy is essentially statistical in nature—not all atoms are moving at the exact same speed. The kinetic energy of any given atom varies as the atoms collide with each other. If you set up a potential energy barrier that’s just a little higher than the average thermal energy, the workings of chance will give an occasional atom a kinetic energy high enough to escape the trap. When an unusually fast atom escapes, it takes with it an unusually large amount of kinetic energy, and the average energy decreases. The group becomes substantially cooler than the potential energy barrier around it.
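To make the mechanism concrete, here is a minimal sketch in Python, under crude assumptions of my own: exponentially distributed energies stand in for Boltzmann statistics, and collisions are modeled as re-randomizing each atom’s energy around the current average.

```python
import random

random.seed(0)

n_atoms = 10_000
barrier = 1.5       # potential energy barrier (arbitrary units)
avg_energy = 1.0    # initial average thermal energy, below the barrier

for _ in range(20):
    # "Collisions": redraw each atom's energy around the current average;
    # chance gives an occasional atom far more than its share.
    energies = [random.expovariate(1.0 / avg_energy) for _ in range(n_atoms)]
    # Atoms energetic enough to clear the barrier escape the trap...
    trapped = [e for e in energies if e < barrier]
    # ...taking their unusually large kinetic energies with them.
    avg_energy = sum(trapped) / len(trapped)
    n_atoms = len(trapped)

print(f"{n_atoms} atoms remain; average energy {avg_energy:.3f}, barrier {barrier}")
```

Each escape removes more than an average share of energy, so the surviving population settles well below the barrier, which is the whole trick of the technique.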
In Festinger, Riecken, and Schachter’s classic When Prophecy Fails, one of the cult members walked out the door immediately after the flying saucer failed to land. Who gets fed up and leaves first? An average cult member? Or a relatively skeptical member, who previously might have been acting as a voice of moderation, a brake on the more fanatic members?
After the members with the highest kinetic energy escape, the remaining discussions will be between the extreme fanatics on one end and the slightly less extreme fanatics on the other end, with the group consensus somewhere in the “middle.”
And what would be the analogy to collapsing to form a Bose-Einstein condensate? Well, there’s no real need to stretch the analogy that far. But you may recall that I used a fission chain reaction analogy for the affective death spiral; when a group ejects all its voices of moderation, then all the people encouraging each other, and suppressing dissent, may internally increase in average fanaticism.1
When Ayn Rand’s long-running affair with Nathaniel Branden was revealed to the Objectivist membership, a substantial fraction of the Objectivist membership broke off and followed Branden into espousing an “open system” of Objectivism not bound so tightly to Ayn Rand. Who stayed with Ayn Rand even after the scandal broke? The ones who really, really believed in her—and perhaps some of the undecideds, who, after the voices of moderation left, heard arguments from only one side. This may account for how the Ayn Rand Institute is (reportedly) more fanatical after the breakup than the original core group of Objectivists under Branden and Rand.
A few years back, I was on a transhumanist mailing list where a small group espousing “social democratic transhumanism” vitriolically insulted every libertarian on the list. Most libertarians left the mailing list; most of the others gave up on posting. As a result, the remaining group shifted substantially to the left. Was this deliberate? Probably not, because I don’t think the perpetrators knew that much psychology.2 At most, they might have thought to make themselves “bigger fish in a smaller pond.”
This is one reason why it’s important to be prejudiced in favor of tolerating dissent. Wait until substantially after it seems to you justified to eject a member from the group before actually ejecting them. If you get rid of the old outliers, the group position will shift, and someone else will become the oddball. If you eject them too, you’re well on the way to becoming a Bose-Einstein condensate and, er, exploding.
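The drift is easy to see in a toy sketch (my own illustration, with invented numbers): give each member an “extremism” score, then repeatedly eject whoever now sits furthest toward moderation.

```python
import random
import statistics

random.seed(0)

# Extremism scores: 0 = perfectly moderate, higher = more fanatical.
group = [random.gauss(0.0, 1.0) for _ in range(50)]

for round_no in range(1, 11):
    group.remove(min(group))  # eject whoever is now the most moderate
    print(f"round {round_no}: consensus at {statistics.mean(group):+.3f}")
```

The consensus climbs with every ejection; a member who sat comfortably in the mainstream ten rounds ago is now the group’s oddball.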
The flip side: Thomas Kuhn believed that a science has to become a “paradigm,” with a shared technical language that excludes outsiders, before it can get any real work done. In the formative stages of a science, according to Kuhn, the adherents go to great pains to make their work comprehensible to outside academics. But (according to Kuhn) a science can only make real progress as a technical discipline once it abandons the requirement of outside accessibility, and scientists working in the paradigm assume familiarity with large cores of technical material in their communications. This sounds cynical, relative to what is usually said about public understanding of science, but I can definitely see a core of truth here.3
1No thermodynamic analogy here, unless someone develops a nuclear weapon that explodes when it gets cold.
2For that matter, I can’t recall seeing the evaporative cooling analogy elsewhere, though that doesn’t mean it hasn’t been noted before.
3My own theory of Internet moderation is that you have to be willing to exclude trolls and spam to get a conversation going. You must even be willing to exclude kindly but technically uninformed folks from technical mailing lists if you want to get any work done. A genuinely open conversation on the Internet degenerates fast.
It’s the articulate trolls that you should be wary of ejecting, on this theory—they serve the hidden function of legitimizing less extreme disagreements. But you should not have so many articulate trolls that they begin arguing with each other, or begin to dominate conversations. If you have one person around who is the famous Guy Who Disagrees With Everything, anyone with a more reasonable, more moderate disagreement won’t look like the sole nail sticking out. This theory of Internet moderation may not have served me too well in practice, so take it with a grain of salt.
If you eject them too, you’re well on the way to becoming a Bose-Einstein condensate and, er, exploding.
Hmmm… to continue the metaphor more directly, their brains get frozen and can no longer update their beliefs?
Apparently debiasing techniques and encouragement of dissent played a role in the American re-evaluation of the Iranian nuclear program:
http://www.nytimes.com/2007/12/05/washington/05intel.html?ref=middleeast

“Over the past year, officials have put into place rigorous new procedures for analyzing conclusions about difficult intelligence targets like Iran, North Korea, global terrorism and China.
Analysts from disparate spy agencies are no longer pushed to achieve unanimity in their conclusions, a process criticized in the past for leading to “groupthink.” Alternate judgments are now encouraged.
In the case of the 2007 Iran report, “red teams” were established to test and find weaknesses in the report’s conclusions. Counterintelligence officials at the C.I.A. also did an extensive analysis to determine whether the new information might have been planted by Tehran to throw the United States off the trail of Iran’s nuclear program.
One result was an intelligence report that some of the intelligence community’s consistent critics have embraced.”
There’s no particular reason to want to continue that metaphor more directly. Only mathematically precise metaphors have that kind of power.
I actually just got confused and thought of “exploding” as (the group ends in fireworks). It’s a bit unintuitive that this meant the opposite (the group becomes unchanging).
You don’t believe in the artistic value of a beautifully extended metaphor?
Unquestionably, things get done a lot more by groups of people who are very much alike. Differences in opinions only tend to brake things.
The question is not whether you need people who are different in order to brake the group. The question is whether you’re in the right group to begin with. As per Kuhn, things will get done faster and better if members of the group share a lot of commonalities.
If you’re in the right group, excluding dissenters will allow you to progress faster. But if you’re in the wrong group, then you’re going to be making progress towards the wrong things.
Have you read Bion’s “Experiences in Groups”? He was an English Freudian, so he was extremely passive while observing group behavior, which is fine, because he was also careful to record what was happening.
I am less satisfied with his analysis, because, as a typical Freudian, he always has ad-hoc reasons why any piece of evidence (or its exact opposite) perfectly confirms his theories. Absolutely impossible to falsify.
What I took from it was that, after you establish a concrete, positive goal for a group’s interactions, for each and every sub-element of the goal, you can find some element of the human personal dynamic, or the human group dynamic, that will work against it.
It is a strong statement along the lines of Murphy’s Law (“Whatever can go wrong, will go wrong”): for every sub-element of a concrete, positive goal, some element of the human personal dynamic, or the human group dynamic, will stand in opposition. You can use it to predict the failure modes of your particular work group.
* If great progress is being made by the group, the individual identities of the members will lessen in importance. So people will assert their individual identities through disruption, gaining attention.
* If progress of the group demands full attention on the final objective, the group will become paranoid and invent some internal or external bug-a-boo to focus on instead.
My personal take, informed by Buddhism, is that this is not necessarily a bad thing. There should be some push-back on the goals of working groups. 99 out of 100 ideas turn out to be terrible ideas, and so it is a good thing that they die from the irrational failure modes of the human personal dynamic or the human group dynamic. If it is a good idea, then it is worth taking care to defend it from the assaults of human irrationality.
And also, attempting to attack head-on any particular irrational failure mode will only make it stronger. For example, a troll will never have so many defenders as when the group leaders focus in to remove him. Better to use Jujutsu. (Trolls are best countered with neglect leading to boredom.)
Bion’s “Experiences in Groups” will give a good sample of failure modes. You can attempt to skillfully steer around them, and keep the group working on the positive goal.
Yours is an interesting idea, keeping a “token” troll around. I would make a rule: any discipline against a troll will be matched by identical discipline against anyone who engages that troll, even in attack. “Feeding the troll” will have precisely the same sanctions as trolling itself.
I’ve believed this was so for a long time, after reading about the Ku Klux Klan. Which, it is weird to think, was once a (for the time) moderate and respected political institution. After the Grand Dragon got caught in a sex scandal, membership dwindled rapidly, leaving a core of fanatics—with results that I think we’re all quite familiar with.
I think the schism in Objectivism ended differently—because I don’t think on the one hand you have the fanatics, but rather simply two groups, each of which ended up defining Objectivism differently. One treats Objectivism as a complete value set, by which every aspect of their lives can be defined. I don’t think they’re fanatics, but, rather individuals who already lived in agreement with that value set. And on the other hand are those whose value system has a different set of values, who treat Objectivism as an approach to thinking rather than the definition of it. The latter definition is best expressed in Atlas Shrugged and We The Living; the former is best expressed in The Fountainhead and her philosophical essays. It’s a distinction in definition. I am, incidentally, of the latter persuasion—I consider myself an Objectivist, but disagree with Ayn Rand on the matter of values. (I, for example, reverse her logic—life is important because it is necessary to reason, rather than the other way around.)
Global warming (caused by humans) is the new sky is falling. I remember y2k too. lol. people!
Y2K didn’t end in catastrophe, and lots and lots of time and money was put into stopping it. I don’t think, from this, you can draw the conclusion that it wouldn’t have ended in catastrophe.
And anyway, Y2K is completely unrelated to climate change. Are you trying to argue that, since everybody was worried about this thing-that-wasn’t-ultimately-much-of-a-problem, people worrying about something is evidence against it being a problem? That sounds like a terrible conclusion to make.
Please don’t feed the troll. Not even under a post on why banning eloquent trolls might be a bad idea.
There’s no particular reason to want to continue that metaphor more directly. Only mathematically precise metaphors have that kind of power.
Well, is it better to continue a vague analogy, or mix your metaphors and end up with something that reads oddly? (“Evaporative cooling” leading to an explosion? Isn’t freezing the relevant phase transition here?)
I’m just nitpicking language here, so maybe I should stop hijacking the thread. ;)
Evaporative cooling could lead to a nuclear explosion. Imagine uranium ore, dissolved in a fluid medium, surrounded by a potential barrier, and heated. Non-uranium particles escape first, followed by lighter isotopes. The end result would be a tight concentration of the heaviest available isotope: critical mass.
As it approached critical, it would generate heat. There would be an equilibrium. In this case, the group itself becomes super stable—but its ideas become unhinged.
I really can’t think of any comparable system, except condensation.
An almost identical system actually happened in nature.
When I followed the link to the page, I watched the gif of a condensate cooling down; at a certain point, when enough of the ‘red’ (warmer) points had disappeared, the white column seemed to surge or ‘explode’ upwards.
So the point is that the idiots who are directly useless—make no useful contributions, have no ideas, spark nothing good—may be useful because they give shelter for others who want to raise controversial ideas?
I’d want to see a group not already mad that suffered for not having an idiot in their number before I believed it...
As a former member of a lot of different “christian” (the quotes are there for a reason) churches, I can only say: you hit the nail on the head! Btw, I’m not a member of any church nowadays.
If you get rid of the old outliers, the group position will shift, and someone else will become the oddball. If you eject them too, you’re well on the way to becoming a Bose-Einstein condensate and, er, exploding.
The real problem is that some group-leaders really want their groups to become BECs, with them at the center of control. So they will do all that is possible to eject dissenting voices. Alas, I speak from experience here.
Maybe those cultists were so brainwashed that they no longer judged their ideas by reality, but reality by their ideas? So in their eyes, strictly speaking, the non-arrival of the space fleet did not count as evidence either way. Scary thought, I know.
Is suvine.com just being ironic after reading the stuff about trolling? If so, you really had me going there for a while.
In practice, such rules create echo chambers where any discussion that the moderators don’t agree with is silenced. It’s theoretically possible for a human being to be utterly impartial when deciding whether conversations are useful, but it requires near-superhuman patience and tolerance.
Quite a lot of the problem is that the category of ‘troll’ quickly expanded beyond its original meaning; in everyday Netspeak, it generally refers to “a person who persists in saying things I don’t wish to tolerate”.
well, it’s tone too.
e.g. say sauvine.com had said: “this is why i think the scientists who believe in global warming have formed a BEC...”
i bet people would downvote, but i doubt they would label them as a troll.
If you have one person around who is the famous Guy Who Disagrees With Everything, anyone with a more reasonable, more moderate disagreement won’t look like the sole nail sticking out.
Like the Devil’s advocate in the catholic church? He seems to have done his role admirably; according to Wikipedia:
Actually, Jehovah’s Witnesses and their predecessors have set one date after another since 1843, and are right now trying to figure out the next:
http://jwemployees.bravehost.com/JWInfo/1001.html
Wow is that sad.
As I recall, that particular religion was originally created so that its founder could sell “blessed” wheat seed at a premium, because he’d been sued so many times over false claims that his wheat produced a higher yield than the common variety. (It was, actually, the same variety, and there was demonstrably no difference.)
He soon discovered that running a cult was more profitable.
Most commenters refer to cultists as if they are not them. Wrong. Understanding how the brain works—how it clings to belief—does not place one outside his own brain.
It is our emotional core of thought that leads to irrational attachment to belief. If machine intelligence can precipitate around a different “thought model” it may escape most(?) of the irrationality of its forerunners.
However, it will probably end up with a different set of irrationalities. We haven’t got any examples of a near-human intelligence that’s inherently rational, and I’d conjecture it’s unlikely that our first few attempts will succeed in this.
I think a better way to use Kuhn’s idea to interpret the process and effects of the skeptics leaving is through an ontological reflection on the group’s essence: we can say that the group or discipline really started existing in a relevant sense (living out its core beliefs, “getting any real work done”) only after the original skeptics were gone. Perhaps the discipline’s increasingly inhospitable environment for internal dissenters is a more relevant place to draw the line of exactly when that discipline “[abandoned] the requirement of outside accessibility” than the group’s own idea of when it started existing. Put another way, the skeptics, while technically belonging to what I’m considering the not-yet-self-realized group, were the outsiders whose access was, in a sense, curtailed by a later change in the tone of discussion, etc.
Nastunya
There’s no particular reason why you need to have a detailed knowledge of the psychological literature for fanatics to try to chase moderates out of a group. That particular piece of knowledge, that if you shout down and away anybody who disagrees with you then you are in control, seems built in to human understanding of politics.
Great post! I’d guess the effect is smaller than cognitive dissonance, but it definitely seems real. The tricky part, as you point out at the end, is how to apply it. But I don’t think it’s that hard to separate trolls from dissenters.
Hrm… I’m not sure having just “that one guy that’s the Known Contrary Guy” would have the desired effect.
Maybe I’m completely wrong, but my personal expectation would be that having just one would instead be a convenient “bad example,” actually resulting in a reduction of dissent among the rest, with people partly wanting, in a sense, to avoid being like that guy. Having more than one may be better, since then it wouldn’t be “that one crazy guy that you don’t want to be like” but “just some people that disagree.”
Or am I completely and utterly wrong on the psychology of this?
Adirian: ”...I consider myself an Objectivist, but disagree with Ayn Rand...”
Do you realize how ridiculous that sounds?
Psy-Kosh, having that some-people-that-disagree dynamic is clearly preferable to having that-one-crazy-guy-you-don’t-want-to-be-like setup, but I’d expect that many such dynamics tend to start with having just one strong dissenter, don’t you think?
Also, I can imagine another difficulty for that Known Contrary Guy: his input may unfortunately be less well received by others not just because of the clash of the views but also, and I fear even more so, because of ad-hominem-type dismissals by the rest of the group.
Adirian: ”...I consider myself an Objectivist, but disagree with Ayn Rand...”
Do you realize how ridiculous that sounds? Would that be more or less ridiculous than saying “I consider myself an Objectivist; I agree with Ayn Rand”?
Let’s make sure we’re all discussing the same thing here: I’d say there’s an appreciable difference between ‘The Guy Who Disagrees With Everything’ and your average troll. Note the word ‘articulate’ in the original post. A clued-up Devil’s Advocate is an important way of keeping yourself honest. An annoying idiot (even just the one) is just a quick way of bringing a debate to its knees. Unfortunately, I fear there’s no way to quickly discern which is which....
The metaphor can be made mathematically precise if we first make the analogy between human decision-making and optimization methods like simulated annealing and genetic algorithms. These optimization methods look for a locally optimal solution, but add some sort of “noise” term to try to find a globally optimal solution. So if we suppose that someone who wants to stay in his own local minimum has a lower “noise” temperature than someone who is open-minded, then the metaphor starts to make sense on a much more profound level.
ME, such an analogy is deeply inappropriate. Simulated annealing and genetic algorithms are rather nitwit, inefficient algorithms that AI programmers use because current programs can’t abstract and compress and leap across the search space the way that humans do. Rationality is not about random noise and selection, or it would take humans millions of years to think anything significantly complex.
Eliezer, you are right, what I really meant to say was, once a person finds a locally optimal solution using whatever algorithm, they then have a threshold for changing their mind, and it is that threshold that is similar to temperature.
Just saw this.
Have a look at the first letter. Classic example of this post’s subject. Newcastle had a good fanbase from all around England a few years ago, but “as soon as Keegan was gone [world not ending as predicted] so were the neutrals [skeptical followers] owing to the fact that they were never really accepted as ‘real’ Newcastle fans on account of they weren’t Geordies. They don’t have any ‘outsider’ fans to keep them calm and give them a sense of reality.” I read that and had to stop myself saying “Evaporative Cooling of Group Beliefs!” out loud in the office....
I’d already thought of this myself before I saw the article. Not only is Evaporative Cooling in evidence at Newcastle United, they are caught in a Happy Death Spiral. For the philistines out there, Newcastle United Football Club recently reappointed Kevin Keegan as manager. Keegan was a star as a player for Newcastle thirty years ago, and had a spell as a manager about ten years back. They played brilliant attacking football under him, but Keegan’s tactical naivete saw them throw away a commanding lead to lose the championship. He left, and it’s been downhill ever since.
In the intervening years the team has struggled to show any real class, and rose-tinted hindsight means the fans still call Keegan ‘The Messiah’. His recent second return has been greeted as if it were actually the second coming. He and the supporters have had a big old media love-in, reinforcing one another’s sense of righteousness, and the club is now unnaturally confident about its prospects. When Keegan’s faults are pointed out to fans, the stock reply is ‘who cares, the Messiah’s back, we’re going to see some exciting football!’
Hope this makes sense in the states....
Typo/missing word alert:
When an unusually fast atom escapes, it takes with an unusually large amount of kinetic energy, and the average energy decreases.
Should be: “it takes with IT an unusually...”
Prophecy Fail: What happens to a doomsday cult when the world doesn’t end? Vaughan Bell, Slate.com, May 20, 2011:
[...]
When the reasonable people start leaving, the center of mass of the group shifts to wack space.
As for the comments on moderation, the solutions described are overly constrained by the pitiful state of web forum software.
Back in the early 90s, I was on an email list with elaborate filtering mechanisms, including collaborative filtering. Usenet had the magic of killfile filtering. Today, we’re lucky to get “thumbs up”, “thumbs down”, and “ignore”. It’s just pitiful. It’s like grubbing around in the Dark Ages, after living in Imperial Rome.
This reminds me of a thought experiment where a perfect average skews toward one extreme when you eliminate one radical. It makes mathematical sense. Apparently, a village can come extremely close to guessing the weight of an ox by taking all of their guesses and averaging them, even if some individuals are radically under or over. But change the scope and you may change the average’s accuracy (or sanity, to use the article’s metaphor). Lock the village in a room with no clocks or windows, wait until 6 a.m., just before any hint of sunlight, then show them the sky and take their guesses. The radicals who guess ‘midnight’ won’t change, but the ones who would have said ‘noon’ will, so the average slides ever earlier, ever more inaccurate. Just a thought model, though; I’ve never read of this precise test being done.
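The clipping effect in the comment above is easy to check numerically. A toy sketch with invented numbers: the true time is 6 a.m., unbiased guesses would average out correctly, but the dark sky caps every guess at dawn while leaving the too-early guesses uncorrected.

```python
import random
import statistics

random.seed(0)

true_hour = 6.0  # just before any hint of sunlight
raw = [random.gauss(true_hour, 3.0) for _ in range(1_000)]
# The dark sky rules out any guess later than dawn, so the high tail
# gets pulled in; the "midnight" radicals stay where they are.
clipped = [min(guess, true_hour) for guess in raw]

print(f"raw mean:     {statistics.mean(raw):.2f}")
print(f"clipped mean: {statistics.mean(clipped):.2f}  (biased early)")
```

The raw mean lands on 6; the clipped mean lands around 4.8, systematically early, which is exactly the skew the thought experiment predicts.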
I’m not certain that that would be wrong. From the observations they have access to, they have no way of telling the difference between different points in the night.
If they can see the moon, however, this changes. Similarly, if they can wait an hour and see what changes. Similarly if they can see the stars, and know roughly what month it is. Because it’s not just the most extreme people who’ll update their beliefs.
(By the way, the averaging thing only works if the individuals don’t communicate about their guesses, which means that this isn’t in any way an accurate representation of the behaviour described in this article!)
“And what would be the analogy to collapsing to form a Bose-Einstein condensate?”
...All of them moving into the same compound and acquiring an arsenal seems about right, particularly when you consider the increased chance of violent explosion.
No thermodynamic analogy here, unless someone develops a nuclear weapon that explodes when it gets cold.
Challenge accepted.
You should just flip it around and call it evaporative *heating.* Human groups work exactly opposite to hot atoms; it is the *cooler* ones who find it easier to escape. Then those who are left get hotter and hotter until they explode.
Typo?
“Early studiers of cults were surprised to discover than when cults receive a major shock”
Should say ”...to discover that when cults...”
Yep, fixed.