What is Rationality?

This article is an attempt to summarize basic material, and thus probably won't have anything new for the experienced crowd.

Related: 11 Core Rationalist Skills, What is Bayesianism?

Less Wrong is a blog devoted to refining the art of human rationality, but what is rationality? Rationality is unlike any subject I studied at school or university, and the synthesis of subjects and ideas here on Less Wrong is probably unique.

Fundamentally, rationality is the study of general methods for good decision-making, especially where the decision is hard to get right. When an individual is considering whether to get a cryonics policy, or when a country is trying to work out what to do about global warming, we are in the realm of decision-making that rationality can improve. People do badly on hard decision problems for a variety of reasons, including: they are not born with the ability to deal with the scientific knowledge and complex systems that our modern world runs on; they haven't been warned that they should think critically about their own reasoning; they belong to groups that collectively hold faulty beliefs; and their emotions and biases skew their reasoning process.

  • Rationality is the ability to do well on hard decision problems.

Another central theme of rationality is truth-seeking. Truth-seeking is often used as an aid to decision-making: if you're trying to decide whether to get a cryonics policy, you might want to find out whether there is any good evidence suggesting that the technology might work. We can make good decisions by getting an accurate estimate of the relevant facts and parameters, and then choosing the best option according to our understanding of things; if our understanding is more accurate, this will tend to work better.

  • Rationality is also the art of how to systematically come to know what is true.

Often, the processes of truth-seeking and decision-making, both on the individual level and the group level, are subject to biases: systematic failures to get to the truth or to make good decisions. Biases in individual humans are an extremely serious problem—most people make important life-decisions without even realizing the extent and severity of the cognitive biases they were born with. Rational thought therefore requires a good deal of critical thinking—analyzing and reflecting on your own thought processes in order to iron out the many flaws they contain. Group dynamics can introduce mechanisms of irrationality above and beyond the individual biases and failings of members of the group, and good decision-making in groups is often most severely hampered by flawed social epistemology. An acute example of this phenomenon is the Pope telling HIV-afflicted Africa to stop using condoms; a social phenomenon (religion) was responsible for a failure to make good decisions.

Perhaps the best way to understand rationality is to see some of the techniques that are used, and some examples of its use.

Rationality techniques and topics include:

  • Heuristics and biases—Perhaps the key insight that started the Overcoming Bias and Less Wrong blogs was the mounting case from experimental psychologists that real human decision-making and belief formation is far from the ideals of economic rationality and Bayesian probability. A key reference on the subject is Judgment under Uncertainty: Heuristics and Biases. For those who prefer the web, Less Wrong has a set of articles tagged "standard biases". If you know your own flaws, you may be able to correct for them—this is known as debiasing.

  • Evolutionary psychology and evolutionary theory—In his bestselling book Fooled by Randomness, Nassim Taleb writes "Our minds are not quite designed to understand how the world works, but, rather, to get out of trouble rapidly and have progeny". Understanding that the process that produced you cared only about inclusive genetic fitness in the ancestral environment, rather than about your welfare or your ability to believe the truth, can help you identify and iron out flaws in your decision-making. There is a good sequence on evolution on Less Wrong. Perhaps the most important piece of work on the implications of evolutionary theory for decision-making and rationality is Bostrom and Sandberg's Wisdom of Nature; although it purportedly aims at assessing human enhancement options, the style of reasoning is highly applicable to thinking about how to deal with the mixed blessings that evolution put inside our skulls.

  • Defeating motivated cognition—Many specific instances and types of biased reasoning are probably created by the same set of sources, often processes deeply intertwined with our evolved psychology. The most pernicious of these "sources of biased reasoning" is motivated cognition, the king of biases. The human mind seems to have a way of short-circuiting itself whereby happy emotions come when you visualize an outcome that is good for you, and this causes you to search for arguments that support the conclusion that that good outcome will occur. This kind of "bottom line reasoning" is insidious, and decreasing the extent to which you suffer from it is a key way to increase your rationality. Leaving a line of retreat is one good antidote. There is a whole sequence on how to actually change your mind that attempts to beat this problem.

  • Techniques of analytic philosophy—Analytic philosophers have spent a long time honing techniques to promote better thinking, especially about conceptually confusing subjects. They will often be very careful to explicitly define the key terms they use, to be open and upfront about terms that they take as primitive, and to be clear about the structure of their arguments.

  • Bayesian statistics and the Bayesian mindset—Covered expertly in the article "What is Bayesianism?"—briefly, the idea is that the beliefs of an ideal rational agent are formed by a process of formulating hypotheses, assigning a prior credence to each, and then using Bayes' theorem to work out how likely the various hypotheses are, given the data. In cases where there is "overwhelming evidence", the strictures of Bayes' theorem are unnecessary: it will be obvious which hypothesis is true. For example, you do not need Bayes' theorem to deduce that Garry Kasparov would beat your grandmother at chess. Related to this are the various errors and lies that can arise from bad (or deliberately misleading) statistical analyses.
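To make the updating process concrete, here is a minimal sketch of a Bayesian update over two hypotheses. The hypotheses, priors, and likelihoods are invented purely for illustration: a coin is either fair or biased towards heads, and we observe a few flips.

```python
# Bayes' theorem applied to two hypotheses about a coin:
#   "fair":   P(heads) = 0.5
#   "biased": P(heads) = 0.8
# Priors and data below are invented for illustration.

priors = {"fair": 0.5, "biased": 0.5}
p_heads = {"fair": 0.5, "biased": 0.8}

def update(beliefs, flip):
    """One Bayesian update: multiply each credence by the likelihood
    of the observed flip under that hypothesis, then renormalize."""
    likelihood = {h: p_heads[h] if flip == "H" else 1 - p_heads[h]
                  for h in beliefs}
    unnormalized = {h: beliefs[h] * likelihood[h] for h in beliefs}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

beliefs = priors
for flip in "HHHH":          # observe four heads in a row
    beliefs = update(beliefs, flip)

print(beliefs)  # most of the credence has shifted to "biased"
```

After four heads, the "biased" hypothesis holds roughly 87% of the credence: the data have worked backwards, through the likelihoods, to reweight the hypotheses.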

  • Microeconomic ways of thinking—Microeconomics models rational agents as aiming to make good personal choices subject to resource constraints. Von Neumann and Morgenstern proved an important theorem stating that the preferences of a "rational" agent can be expressed as a utility function. Other researchers in microeconomics made significant advances by considering the marginal utility of actions—how much better do things get if one shifts one dollar of one's expenditure from buying ice-cream to buying clothes? The notion of opportunity cost is a classic example of a microeconomic concept. Value of information is another. In recent times, microeconomics has taken human psychology into account more, leading to formal theories of boundedly rational and irrational agents, such as prospect theory.
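The marginal-utility question above can be made concrete. A minimal sketch, assuming (purely for illustration) logarithmic utility in each good and a ten-dollar budget:

```python
import math

# Diminishing marginal utility, illustrated with invented log utilities.
# Utility of spending x dollars on a good is log(1 + x); total utility
# is the sum over goods.

def utility(ice_cream, clothes):
    return math.log(1 + ice_cream) + math.log(1 + clothes)

# Marginal gain from shifting one dollar from ice-cream to clothes,
# starting from an unbalanced allocation of the $10 budget:
before = utility(9, 1)
after = utility(8, 2)
print(after - before)  # positive: the shift is an improvement

# With identical log utilities, the best split of the budget is even:
best = max(((i, 10 - i) for i in range(11)), key=lambda a: utility(*a))
print(best)  # (5, 5)
```

Because log utility is concave, each extra dollar spent on a good is worth less than the last one, so shifting spending from a saturated good to a neglected one raises total utility until the allocation balances out.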

  • Game theory and signaling games—A sub-field of microeconomics so important that it deserves a separate mention, game theory analyzes the interactions between competing rational agents in a formal way. The key intuition pump is the prisoner's dilemma, but I think that the formal analysis of signaling games is even more important for rationality, as signaling games explain so much about why people verbally endorse statements (the statement is there as a signal, not as an indicator of rational belief). Robin Hanson of Overcoming Bias has posted many times on how the subconscious human desire to signal affects our decision-making in weird ways.
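To make the prisoner's dilemma concrete, here is a minimal sketch using a conventional payoff structure (years in prison, so lower is better; the exact numbers are one common choice, not canonical):

```python
# Prisoner's dilemma: each player either cooperates ("C") with the
# other prisoner by staying silent, or defects ("D") by confessing.
# Payoffs are (my years, partner's years); lower is better.
payoffs = {
    ("C", "C"): (1, 1),   # both stay silent: one year each
    ("C", "D"): (5, 0),   # I stay silent, partner confesses
    ("D", "C"): (0, 5),   # I confess, partner stays silent
    ("D", "D"): (3, 3),   # both confess: three years each
}

def best_response(partner_move):
    """The move that minimizes my own sentence,
    holding the partner's move fixed."""
    return min("CD", key=lambda me: payoffs[(me, partner_move)][0])

# Defecting is a dominant strategy: it is the best response to either
# choice the partner might make...
print(best_response("C"), best_response("D"))  # D D

# ...yet mutual defection (3, 3) is worse for both players than
# mutual cooperation (1, 1).
```

The tension the code exhibits, that individually rational play leads both players to an outcome they both disprefer, is what makes the dilemma the key intuition pump of game theory.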