Some Remarks on the Nature of Political Conflict

[Part of the debate around: Conflict Vs. Mistake]

[Criticizing articles like: In defence of conflict theory, Conservatives As Moral Mutants (part of me feels like the link is self-trigger-warning, but I guess I will just warn you that this is not a clever attention-grabbing title, the link means exactly what it says and argues it at some length)]

[Related to: Knowing About Biases Can Hurt People, Would Your Real Preferences Please Stand Up?, The Cowpox of Doubt, Guided By The Beauty Of Our Weapons]

[Epistemic effort: I thought of this argument and was so pleased by my own cleverness that I decided to post it.]

[Note: I have a nagging feeling I've spent a thousand words spelling out something completely obvious. Still, I hope there's value in actually spelling it out.]

There has been a flurry of discussion around the nature of political conflict in the rationality movement for the last five months, sparked by a blog post by Scott Alexander on his blog Slate Star Codex drawing a dichotomy between mistake theorists, who think their political opponents are mistaken on factual policy questions, and conflict theorists, who think their political opponents are innately evil. There have been a lot of good articles on the subject on every side and on both the object-level and the meta-level (well, on both the meta-level and the meta-meta-level), but also many bad ones resting on mistakes (I know, I am showing my side here).

One class of pro-conflict-theory arguments that bothers me a lot goes like this:

Mistake theory can't be the correct worldview because, for example, it's historically documented that tobacco companies hired scientists to spread misinformation about whether smoking causes cancer instead of thinking about it in a rational way.

Other historical case studies used include the rise of liberal democracy, the abolition of slavery, giving women the right to vote, the end of segregation, etc.

A scientific theory that is often used in this kind of argument is Jonathan Haidt's work on political psychology. Jonathan Haidt, with his co-conspirator Jesse Graham, created moral foundations theory, according to which morality is divided into five foundations:

  • Care: cherishing and protecting others; opposite of harm

  • Fairness or proportionality: rendering justice according to shared rules; opposite of cheating

  • Loyalty or ingroup: standing with your group, family, nation; opposite of betrayal

  • Authority or respect: submitting to tradition and legitimate authority; opposite of subversion

  • Sanctity or purity: abhorrence for disgusting things, foods, actions; opposite of degradation

A shocking and unexpected discovery of moral foundations theory is that conservatives value Loyalty, Authority, and Sanctity more than liberals do. (Liberals also value Care and Fairness more than conservatives do, but this effect is much smaller than the other one.) Some conflict theorists, both liberal and conservative, have seized on this to claim that conflict theory is correct and those darned Blues are moral mutants who can't listen.

This is the popular understanding of moral foundations theory, anyway. In reality, this is only pluralism, the fourth claim of moral foundations theory. The four claims of moral foundations theory are¹:

  1. Nativism: There is a “first draft” of the moral mind

  2. Cultural learning: The first draft gets edited during development within a particular culture

  3. Intuitionism: Intuitions come first, strategic reasoning second

  4. Pluralism: There were many recurrent social challenges in our evolutionary history, so there are many moral foundations

The third claim is intuitionism. Social intuitionism, as a psychological theory, is older than the moral pluralism that is often equated with moral foundations theory in pop science. Jonathan Haidt wrote about it in 2001, years before he wrote about moral pluralism. Social intuitionism is a model that proposes that moral positions and judgments are²:

  1. primarily intuitive (“intuitions come first”)

  2. rationalized, justified, or otherwise explained after the fact

  3. taken mainly to influence other people, and are

  4. often influenced and sometimes changed by discussing such positions with others

If you look at what you think is moral foundations theory (but is actually only moral pluralism, without the background of social intuitionism that is necessary to fully understand it), you might get the impression that people with different moral intuitions than yours consciously choose those intuitions. The reality is much, much worse than that. Let's say Pro-Skub people value Skub and Anti-Skub people don't. Pro-Skub people don't know that their moral positions and judgments are primarily intuitive. They don't know that intuitions come first. They rationalize their valuing of Skub, justify it, and otherwise explain it after the fact. Similarly, Anti-Skub people rationalize their not valuing Skub, justify it, and otherwise explain it after the fact.

This is very different from what the popular misunderstanding suggests! The popular misunderstanding suggests that you can trust your brain to be correct about the value of Skub, since the only reason your opponents do or don't value Skub is that they have different terminal values than you. In reality, social intuitionism says that your brain is broken, that it is rationalizing its reasons to value or not value Skub, and that your opponents' brains are broken in the same way. Social intuitionism says that you can't trust your broken brain.

Rationalization is, of course, not limited to moral positions and judgments. It and its buddies confirmation bias and motivated cognition wander everywhere. It's not a coincidence that Motivated Stopping and Motivated Continuation specifically use the example of tobacco science. But you (yes, you) aren't immune to rationalization, confirmation bias, or motivated cognition. You can't trust your brain not to do it. You can't trust your brain not to be the next conflict theorist case study.

Luckily, the fourth tenet of social intuitionism is that moral positions and judgments are often influenced and sometimes changed by discussing such positions with others. Your best way to not let your brain be the next conflict theorist case study is to deliberately exploit this as best you can. To not let your brain be the next conflict theorist case study, debate is essential. We all bring different forms of expertise to the table, and once we all understand the whole situation, we can use wisdom-of-crowds to converge on the correct answer. Who wins on any particular issue is less important than creating an environment where your brain won't be the next conflict theorist case study.
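
As a rough illustration of why the wisdom-of-crowds part only works when people bring genuinely different perspectives, here is a minimal Python sketch (my own toy example with made-up numbers, not anything from Haidt or from the original debate): averaging many independent noisy estimates converges on the truth, but a bias shared by the whole crowd does not average out.

    import random

    # Toy illustration: each person reports a noisy estimate of some true
    # quantity. Averaging many *independent* estimates cancels the noise,
    # but a *shared* bias (everyone rationalizing in the same direction)
    # survives no matter how many people you average.

    TRUE_VALUE = 10.0

    def independent_estimate():
        # unbiased individual estimate with large random error
        return random.gauss(TRUE_VALUE, 5.0)

    def biased_estimate(shared_bias=3.0):
        # same noise, plus a bias common to the whole crowd
        return random.gauss(TRUE_VALUE + shared_bias, 5.0)

    def crowd_average(estimator, n):
        # average of n individual estimates
        return sum(estimator() for _ in range(n)) / n

    for n in (1, 10, 1000):
        print(f"n={n:5d}  independent crowd: {crowd_average(independent_estimate, n):6.2f}"
              f"  shared-bias crowd: {crowd_average(biased_estimate, n):6.2f}")

In this toy setup, the independent crowd's average approaches 10.0 as n grows, while the shared-bias crowd settles around 13.0: adding more people doesn't help if everyone's brain is broken in the same way. That is exactly why the debate has to surface different intuitions rather than pile up copies of the same one.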

What's the worst thing you could do in your quest to not let your brain be the next conflict theorist case study? Probably treating everything as war and viewing debate as having a minor clarifying role at best. That's the best way for rationalization, confirmation bias, motivated cognition, and self-serving bias to creep in. This is how most of the conflict theorist case studies thought.

Mistake theory is the correct worldview precisely because tobacco companies hired scientists to spread misinformation about whether smoking causes cancer instead of thinking about it in a rational way.

¹: Graham, Jesse; Haidt, Jonathan; Koleva, Sena; Motyl, Matt; Iyer, Ravi; Wojcik, Sean P.; Ditto, Peter H. (November 28, 2012). "Moral Foundations Theory: The Pragmatic Validity of Moral Pluralism." Advances in Experimental Social Psychology, forthcoming. Available at SSRN: https://ssrn.com/abstract=2184440

²: Haidt, Jonathan (2012). The Righteous Mind: Why Good People Are Divided by Politics and Religion. Pantheon. p. 913 (Kindle ed.). ISBN 978-0307377906.
