An Introduction To Rationality

This article is an attempt to summarize basic material, and thus probably won’t have anything new for the hard core posting crowd. It’d be interesting to know whether you think there’s anything essential I missed, though.

Summary

We have a mental model of the world that we call our beliefs. This model does not always reflect reality very well, as our perceptions distort the information coming from our senses. The same is true of our desires, such that we do not accurately know why we desire something. It may also be true for other parts of our subconscious. We, as conscious beings, can notice when there is a discrepancy and try to correct for it. This is known as epistemic rationality; seen this way, science is a formalised version of it. Instrumental rationality is the act of doing what we value, whatever that may be.

Both forms of rationality can be learnt, and it is the aim of this document to convince you that learning them is both a good idea and worth your time.

Hill Walker’s Analogy

Reality is the physical hills, streams, roads and landmarks (the territory), and our beliefs are the map we use to navigate them. We know that the map is not the same thing as the territory, but we hope it is accurate enough to understand things about the territory all the same. Errors in the map can easily lead us to do the wrong thing.

The Lens That Distorts

We see the world through a filter, or lens, called perception. This is necessary; there is too much information in the world to map fully, yet it is very important to recognise certain things quickly. For example, it is useful to recognise that a lion is about to attack you.

To go back to our hill walker’s analogy: we are quicker navigating with a hill walker’s map than with an aerial photograph, because the map has been filtered to show only what walkers commonly need. Too much information can slow our judgement or confuse us. On the other hand, incorrect information or a lack of information is just as bad. If, when out walking, we are told to turn left at the second stream, yet the hill has more streams than the map shows, we will get lost. The map was not adequate in this situation.

In the same way, our own maps can be wrong because of errors in our perceptual filter. Optical illusions are a nice example of such a phenomenon, where there is a distortion between map and territory.

An Example

Light from the sun bounces off our shoelaces, which are untied, and hits our eye (reality). These signals get perceived (the lens) as an untied shoelace, and thus we have the belief (map) that our shoelaces are untied. In this case the territory and map reflect the same thing, but the map contains a condensed version; the exact position of the laces was not deemed important and hence was filtered out by our perception.

Truth

“The sentence ‘snow is white’ is true if, and only if, snow is white.”—Alfred Tarski

What is being said here is that if the reality is that ‘snow is white’, then we should believe that ‘snow is white’. In other words, we should try to make our beliefs match reality. Unfortunately we cannot directly tell whether snow really is white, but given enough evidence we should believe it to be true.

By default, our subconscious believes what it sees; it has to. You have no time to question your beliefs if a lion is coming towards you, and we could not have survived this long as a species if we questioned and tested everything. Yet there are times when we are reproducibly, predictably irrational. That is, we do not update our beliefs correctly based upon evidence.

Rationality

  • Epistemic rationality: believing, and updating on evidence, so as to systematically improve the correspondence between your map and the territory. The art of obtaining beliefs that correspond to reality as closely as possible. This correspondence is commonly termed “truth” or “accuracy”.

  • Instrumental rationality: achieving your values. Not necessarily “your values” in the sense of being selfish values or unshared values: “your values” means anything you care about. The art of choosing actions that steer the future toward outcomes ranked higher in your preferences. Sometimes referred to as “winning”.

These definitions are the two kinds of rationality used here. Epistemic rationality is making the map match the territory; an example of an epistemic rationality error would be believing a cloud of steam is a ghost. Instrumental rationality is doing what you should, based upon what you value; an instrumental rationality error would be working rather than going to your friend’s birthday when your values say you should have gone.

Why Should We Be Rational?

  • Curiosity – an innate desire to know things, e.g. “how does that work?”

  • Pragmatism – we can better achieve what we want in the world if we understand it better, e.g. “I want to build an aeroplane; therefore I need to know about lift and aerodynamics”, or “I want some milk; I need to know whether to go to the fridge or the store”. If we are irrational, our beliefs may cause a plane crash, or send us walking to the store when there is milk in the fridge.

  • Morality – the belief that truth-seeking is important to society, and that it is therefore a moral requirement to be rational.

What Rationality Is And Is Not

Being rational does not mean thinking without emotion. Being rational and being logical are different things. It may be more rational for you to go along with something you believe to be incorrect if doing so fits with your values.

For example, in an argument with a friend it may be logical to stick to what you know is true, but rationally you may just concede a point and help them, even if you think it is not in their best interest. It all depends on your values, and following your values is part of the definition of instrumental rationality.

If you say “the rational thing to do is X but the right thing is Y” then you are using a different definition of rationality than is intended here. The right thing is always the rational thing, by definition.

Rationality also differs for different people; the same action may be rational for one person and irrational for another. This could be due to:

  • Different realities. For example, living in a country with deadly spiders should give you more reason to be afraid of them. Hence the belief that spiders are scary is only rational in certain countries.

  • Different values. For example, two people may agree on how unlikely it is to win the lottery, but one may still value the prize enough to enter. Hence playing the lottery can be either rational or irrational depending on your values (a worked sketch follows this list).

  • Incorrect beliefs. If you believe that a light-bulb will work without electricity and do not have sufficient evidence to support the claim, then your belief is wrong. In fact, if you believe anything without sufficient evidence then you are in error, but what constitutes sufficient evidence is down to your values, and hence this is another valid reason for beliefs to differ.
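
To make the lottery example above concrete, here is a worked sketch with made-up numbers; the ticket price, jackpot and odds are illustrative assumptions, not real lottery statistics. If a ticket costs £2, the jackpot is £10,000,000 and the chance of winning is 1 in 45,000,000, the expected monetary value of a ticket is

    E[\text{winnings}] = \frac{10{,}000{,}000}{45{,}000{,}000} - 2 \approx -1.78

So in purely monetary terms a ticket loses about £1.78 on average. If the enjoyment of playing is worth more than that to you, entering is instrumentally rational; if not, it is not. Same numbers, different values, different rational action.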

When you encounter someone who has different beliefs, the difference may be down to the points above, in which case it is worth trying to see the world through their lens to understand how their belief came about. You may still conclude that their belief is wrong. There is no problem with this; not all beliefs are based in rationality.

An important thing to consider in such discussions is whether you are actually arguing over different points or merely using different definitions. For example, two people may argue about how many people live in New York because each is using a slightly different definition of New York. The same sort of thing happens with the old saying “if a tree falls in the woods and no one is around to hear it, does it make a sound?”. People may have different answers based upon their definition of “makes a sound”, but their expected experiences are generally the same: that there are sound waves but not the perception of sound.

Science and Rationality

Science is a system of acquiring knowledge based on the scientific method, and the organized body of knowledge gained through such research. In other words, it is the act of testing theories and gaining information from those tests. A theory proposes an explanation for an event; that is all it is. The theory may or may not be useful; a useful theory is one that is:

  1. Logically consistent,

  2. Usable as a tool for prediction.

If this sounds like rationality, you are correct. Science attempts to obtain a true map of reality using a specific set of techniques; it is, in essence, a more formalised version of rationality.

A theory is very strongly linked to a belief. Indeed, both should have the two traits listed above: logical consistency and the ability to predict events. Just like a theory, a belief proposes an explanation for events.

The Lens That Sees Its Flaws

We, as conscious beings, have the ability to correct for the distortions in our perceptual filter. We may not be able to see through an optical illusion (our lens will always be flawed), but we can choose to believe that it is an illusion based upon other evidence and our own thoughts.

In the image on the left below (the Müller-Lyer illusion) we may see the lines as different lengths, but through other evidence choose not to believe what our senses are telling us. The image on the right (the Munker illusion) is even harder to disbelieve: the red and purple looking colours on the top part are actually the same, as are the green and turquoise looking ones on the bottom. I hope that you check this for yourself. Even after you convince yourself of the illusion it will still be very hard to see them as the same colour. We may never be able to fix the flaws in our perception, but by being aware of them we can reduce the mistakes we make because of them.
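
If you would like to check the colours rather than take them on trust, the short Python sketch below shows one way to do it. It assumes you have Pillow installed and have saved the illusion image locally; the filename munker.png and the pixel coordinates are placeholders for your own image and sample points, not part of the original article.

    # A minimal sketch: verify that two regions of an illusion image that
    # look like different colours are in fact the same RGB value.
    # Assumes Pillow is installed (pip install Pillow); "munker.png" and
    # the (x, y) coordinates are placeholders to replace with your own.
    from PIL import Image

    image = Image.open("munker.png").convert("RGB")

    red_looking = image.getpixel((120, 40))     # a pixel that looks red
    purple_looking = image.getpixel((320, 40))  # a pixel that looks purple

    print("red-looking pixel:   ", red_looking)
    print("purple-looking pixel:", purple_looking)
    print("identical:", red_looking == purple_looking)

If the two tuples print as identical, your lens, not the image, is producing the difference.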

Our Modular Brain

We don’t know what many of our desires are, we don’t know where they come from, and we can be wrong about our own motivations. For example, we may think the desire to give oral sex is for the pleasure of our partner, whereas it has been suggested that our bodies create this desire as a test for health, fertility and infidelity. We consciously feel the desire and ascribe a reason to it, but it can be a different reason to the subconscious one.

Because of this we should treat the signals coming from our subconscious as distorted by yet another lens, one that hides much of what is behind it. We (speaking as the conscious) notice our subconscious desires and try to infer why they have occurred, yet this understanding may be wrong. Again, these flaws in our lens can be corrected for. You probably already temper the amount of sugar and fat you eat, even though you have a subconscious desire to eat more.

Another example: there are no seriously dangerous spiders in the UK, so there is no rational reason to be afraid of them, yet the fear seems to be a universal trait. On seeing a spider and feeling afraid, we can recognise that particular flaw in our subconscious reasoning and choose to act differently by not running away.

What Should We Do?

To be epistemically rational we must look for systematic flaws in the perceptual filters (lenses) between reality and our brain, as well as within different parts of the brain. We must then train ourselves to recognise and correct them when they occur. Simply talking or thinking about them is not enough; active training in recognising and dealing with such errors is needed.

To be instrumentally rational we must define our values, updating them as needed, and learn how to achieve them. Simply having well-defined values is not enough; everything we do should work towards achieving them in some way. There are methods that can be learnt to help with achieving goals, on topics such as procrastination, self-help and goal-setting, and it makes sense to learn these, especially if you are prone to procrastination.

One of the most important concepts to grasp is the best way to update our beliefs based upon what we experience (the evidence). Thomas Bayes formulated this mathematically, such that if we were behaving rationally we would expect our updates to follow Bayes’ Theorem.
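
For reference, the theorem itself is short. Writing H for a hypothesis (a belief) and E for the evidence observed:

    P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}

Here P(H) is the prior, how strongly you believed the hypothesis before seeing the evidence, and P(H | E) is the posterior, how strongly you should believe it afterwards.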

Research shows that, in some cases at least, people do not follow this model. Hence one of our first goals should be to think about evidence in a Bayesian way. See http://yudkowsky.net/rational/bayes for an intuitive explanation of Bayes’ Theorem.

One important concept that comes out of the theorem I will briefly introduce here: evidence must update our existing beliefs, not replace them. If a test comes up positive for cancer, the probability that you have it depends on the accuracy of the test AND the prevalence of cancer in the general population. This is likely to seem strange unless you think of it in a Bayesian way.
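
To see why, here is a worked example with made-up numbers; the prevalence and test accuracies are illustrative assumptions, not real medical statistics. Suppose 1% of the population has the cancer, the test correctly flags 80% of people who have it, and falsely flags 10% of people who do not. Then

    P(\text{cancer} \mid \text{positive}) = \frac{0.8 \times 0.01}{0.8 \times 0.01 + 0.1 \times 0.99} = \frac{0.008}{0.107} \approx 0.075

Despite the positive result, the probability of having the cancer is only about 7.5%, because the rarity of the disease (the prior) outweighs the accuracy of the test. Jumping straight from “the test catches 80% of cancers” to “I probably have cancer” is exactly the kind of update error the theorem guards against.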

Summary

  • The physical world is reality.

  • Inside our brains we have beliefs.

  • Our beliefs are meant to mirror reality.

  • A good belief, like a good scientific theory, is:

    • Logically consistent – it fits in with every other good belief.

    • A predictor of events – it helps you predict the future and explain past events.

  • Our perception of the world can distort these beliefs.

  • We can change how we perceive things through conscious thought.

  • This can reduce the error between reality and our beliefs.

  • This is called rationality.