Tribalism

Last edit: 19 Aug 2020 21:39 UTC by Ruby

Tribalism, or coalitional instincts, is closely connected to the concept of in-groups and out-groups. Coalitional instincts drive humans to act in ways that cause them to join, support, defend, and maintain their membership in various coalitions defined by a shared common identity. An illustrative example can be found in A Fable of Science and Politics.

See also: Blues and Greens, Groupthink, Motivated Reasoning, Social and Cultural Dynamics, Social Reality.

The primary function that drove the evolution of coalitions is the amplification of the power of its members in conflicts with non-members. This function explains a number of otherwise puzzling phenomena. For example, ancestrally, if you had no coalition you were nakedly at the mercy of everyone else, so the instinct to belong to a coalition has urgency, preexisting and superseding any policy-driven basis for membership. This is why group beliefs are free to be so weird. [...]
… to earn membership in a group you must send signals that clearly indicate that you differentially support it, compared to rival groups. Hence, optimal weighting of beliefs and communications in the individual mind will make it feel good to think and express content conforming to and flattering to one’s group’s shared beliefs and to attack and misrepresent rival groups. The more biased away from neutral truth, the better the communication functions to affirm coalitional identity, generating polarization in excess of actual policy disagreements. Communications of practical and functional truths are generally useless as differential signals, because any honest person might say them regardless of coalitional loyalty. In contrast, unusual, exaggerated beliefs [...] are unlikely to be said except as expressive of identity, because there is no external reality to motivate nonmembers to speak absurdities.
-- John Tooby, “Coalitional Instincts”

Humans interact in dense social networks, and this poses a problem for bystanders when conflicts arise: which side, if any, to support. Choosing sides is a difficult strategic problem because the outcome of a conflict critically depends on which side other bystanders support. One strategy is siding with the higher status disputant, which can allow bystanders to coordinate with one another to take the same side, reducing fighting costs. However, this strategy carries the cost of empowering high-status individuals to exploit others. A second possible strategy is choosing sides based on preexisting relationships. This strategy balances power but carries another cost: Bystanders choose different sides, and this discoordination causes escalated conflicts and high fighting costs. We propose that moral cognition is designed to manage both of these problems by implementing a dynamic coordination strategy in which bystanders coordinate side-taking based on a public signal derived from disputants’ actions rather than their identities. By focusing on disputants’ actions, bystanders can dynamically change which individuals they support across different disputes, simultaneously solving the problems of coordination and exploitation.
-- Peter DeScioli & Robert Kurzban, “A Solution to the Mysteries of Morality”

Politics is the Mind-Killer

Eliezer Yudkowsky · 18 Feb 2007 21:23 UTC
154 points · 232 comments · 2 min read · LW link

Ethnic Tension And Meaningless Arguments

Scott Alexander · 5 Nov 2014 3:38 UTC
41 points · 5 comments · 22 min read · LW link

Belief as Attire

Eliezer Yudkowsky · 2 Aug 2007 17:13 UTC
77 points · 104 comments · 2 min read · LW link

Applause Lights

Eliezer Yudkowsky · 11 Sep 2007 18:31 UTC
181 points · 93 comments · 2 min read · LW link

I Can Tolerate Anything Except The Outgroup

Scott Alexander · 2 Sep 2017 8:22 UTC
36 points · 1 comment · 28 min read · LW link

A Fable of Science and Politics

Eliezer Yudkowsky · 23 Dec 2006 4:50 UTC
225 points · 102 comments · 5 min read · LW link

Professing and Cheering

Eliezer Yudkowsky · 2 Aug 2007 7:20 UTC
71 points · 44 comments · 2 min read · LW link

Dunbar’s Function

Eliezer Yudkowsky · 31 Dec 2008 2:26 UTC
54 points · 65 comments · 6 min read · LW link

28 social psychology studies from *Experiments With People* (Frey & Gregg, 2017)

Yuxi_Liu · 16 Jun 2019 2:23 UTC
12 points · 1 comment · 26 min read · LW link

Anti-tribalism and positive mental health as high-value cause areas

Kaj_Sotala · 2 Aug 2018 8:30 UTC
24 points · 5 comments · 2 min read · LW link
(kajsotala.fi)

Fight the Power

Jacob Falkovich · 22 Jun 2020 2:19 UTC
23 points · 3 comments · 7 min read · LW link

Every Cause Wants To Be A Cult

Eliezer Yudkowsky · 12 Dec 2007 3:04 UTC
79 points · 33 comments · 3 min read · LW link

False Laughter

Eliezer Yudkowsky · 22 Dec 2007 6:03 UTC
39 points · 66 comments · 3 min read · LW link

Plan to Be Lucky

Jacob Falkovich · 16 Jan 2018 1:00 UTC
11 points · 3 comments · 1 min read · LW link
(putanumonit.com)

[Book Review] Destiny Disrupted

lsusr · 21 Mar 2021 7:09 UTC
54 points · 4 comments · 9 min read · LW link

Improving local governance in fragile states—practical lessons from the field

rockthecasbah · 29 Jul 2020 1:54 UTC
16 points · 3 comments · 6 min read · LW link

How to Not Lose an Argument

Scott Alexander · 19 Mar 2009 1:07 UTC
134 points · 416 comments · 6 min read · LW link

Civility Is Never Neutral

ozymandias · 22 Nov 2017 16:54 UTC
54 points · 15 comments · 4 min read · LW link

Science as Attire

Eliezer Yudkowsky · 23 Aug 2007 5:10 UTC
101 points · 88 comments · 2 min read · LW link

Of Exclusionary Speech and Gender Politics

Eliezer Yudkowsky · 21 Jul 2009 7:22 UTC
87 points · 669 comments · 5 min read · LW link

Have no heroes, and no villains

PhilGoetz · 7 Nov 2010 21:15 UTC
114 points · 74 comments · 1 min read · LW link

Wrongology 101

sarahconstantin · 25 Apr 2018 0:00 UTC
31 points · 5 comments · 7 min read · LW link
(srconstantin.wordpress.com)

Missing the Trees for the Forest

Scott Alexander · 22 Jul 2009 3:23 UTC
82 points · 159 comments · 7 min read · LW link

Stop Voting For Nincompoops

Eliezer Yudkowsky · 2 Jan 2008 18:00 UTC
61 points · 91 comments · 8 min read · LW link

Politics is hard mode

Rob Bensinger · 21 Jul 2014 22:14 UTC
48 points · 107 comments · 6 min read · LW link

Only You Can Prevent Your Mind From Getting Killed By Politics

ChrisHallquist · 26 Oct 2013 13:59 UTC
58 points · 144 comments · 5 min read · LW link

Why artificial optimism?

jessicata · 15 Jul 2019 21:41 UTC
63 points · 29 comments · 4 min read · LW link
(unstableontology.com)

Do you fear the rock or the hard place?

Ruby · 20 Jul 2019 22:01 UTC
56 points · 10 comments · 5 min read · LW link · 2 nominations · 3 reviews

Memetic Tribalism

[deleted] · 14 Feb 2013 3:03 UTC
61 points · 62 comments · 4 min read · LW link

Anti-tribalism and positive mental health as high-value cause areas

Kaj_Sotala · 17 Oct 2017 10:20 UTC
12 points · 14 comments · 2 min read · LW link
(kajsotala.fi)

Status for status sake is a fact of political life

rockthecasbah · 18 Aug 2020 22:06 UTC
10 points · 11 comments · 1 min read · LW link

Journal article about politics and mindkilling

CronoDAS · 7 Sep 2011 7:46 UTC
41 points · 2 comments · 1 min read · LW link

Schism Begets Schism

Davis_Kingsley · 10 Jul 2019 3:09 UTC
24 points · 25 comments · 3 min read · LW link

Manufacturing prejudice

PhilGoetz · 3 Apr 2011 17:26 UTC
37 points · 73 comments · 1 min read · LW link

Factions, inequality, and social justice

John_Maxwell · 3 Dec 2012 19:37 UTC
43 points · 173 comments · 6 min read · LW link

In defense of common-sense tribalism

toonalfrink · 2 Nov 2017 8:43 UTC
10 points · 5 comments · 2 min read · LW link

Taking the Outgroup Seriously

Davis_Kingsley · 16 Feb 2020 13:23 UTC
21 points · 8 comments · 2 min read · LW link

Supporting the underdog is explained by Hanson’s Near/Far distinction

Roko · 5 Apr 2009 20:22 UTC
27 points · 27 comments · 4 min read · LW link

The Thinking Ladder—Wait But Why

Liron · 29 Sep 2019 18:51 UTC
19 points · 1 comment · 1 min read · LW link
(waitbutwhy.com)

The Sword of Good

Eliezer Yudkowsky · 3 Sep 2009 0:53 UTC
105 points · 300 comments · 2 min read · LW link

Notes on Loyalty

David_Gross · 15 Nov 2020 19:30 UTC
17 points · 2 comments · 8 min read · LW link

That Thing That Happened

[deleted] · 18 Dec 2012 12:29 UTC
32 points · 85 comments · 2 min read · LW link

The bias shield

PhilGoetz · 31 Dec 2011 17:44 UTC
29 points · 65 comments · 6 min read · LW link

Mini thoughts on mintheism

CraigMichael · 11 Jan 2021 5:14 UTC
10 points · 1 comment · 7 min read · LW link