The Least Convenient Possible World

Related to: Is That Your True Rejection?

“If you’re interested in being on the right side of disputes, you will refute your opponents’ arguments. But if you’re interested in producing truth, you will fix your opponents’ arguments for them. To win, you must fight not only the creature you encounter; you must fight the most horrible thing that can be constructed from its corpse.”

-- Black Belt Bayesian, via Rationality Quotes 13

Yesterday John Maxwell’s post wondered how much the average person would do to save ten people from a ruthless tyrant. I remember asking some of my friends a vaguely related question as part of an investigation of the Trolley Problems:

You are a doctor in a small rural hospital. You have ten patients, each of whom is dying for the lack of a separate organ; that is, one person needs a heart transplant, another needs a lung transplant, another needs a kidney transplant, and so on. A traveller walks into the hospital, mentioning how he has no family and no one knows that he’s there. All of his organs seem healthy. You realize that by killing this traveller and distributing his organs among your patients, you could save ten lives. Would this be moral or not?

I don’t want to discuss the answer to this problem today. I want to discuss the answer one of my friends gave, because I think it illuminates a very interesting kind of defense mechanism that rationalists need to be watching for. My friend said:

It wouldn’t be moral. After all, people often reject organs from random donors. The traveller would probably be a genetic mismatch for your patients, and the transplantees would have to spend the rest of their lives on immunosuppressants, only to die within a few years when the drugs failed.

On the one hand, I have to give my friend credit: his answer is biologically accurate, and beyond a doubt the technically correct answer to the question I asked. On the other hand, I don’t have to give him very much credit: he completely missed the point and lost a valuable opportunity to examine the nature of morality.

So I asked him, “In the least convenient possible world, the one where everyone was genetically compatible with everyone else and this objection was invalid, what would you do?”

He mumbled something about counterfactuals and refused to answer. But I learned something very important from him, and that is to always ask this question of myself. Sometimes the least convenient possible world is the only place where I can figure out my true motivations, or which step to take next. I offer three examples:

1: Pascal’s Wager. Upon being presented with Pascal’s Wager, one of the first things most atheists think of is this:

Perhaps God values intellectual integrity so highly that He is prepared to reward honest atheists, but will punish anyone who practices a religion he does not truly believe simply for personal gain. Or perhaps, as the Discordians claim, “Hell is reserved for people who believe in it, and the hottest levels of Hell are reserved for people who believe in it on the principle that they’ll go there if they don’t.”

This is a good argument against Pascal’s Wager, but it isn’t the least convenient possible world. The least convenient possible world is the one where Omega, the completely trustworthy superintelligence who is always right, informs you that God definitely doesn’t value intellectual integrity that much. In fact (Omega tells you) either God does not exist or the Catholics are right about absolutely everything.

Would you become a Catholic in this world? Or are you willing to admit that maybe your rejection of Pascal’s Wager has less to do with a hypothesized pro-atheism God, and more to do with a belief that it’s wrong to abandon your intellectual integrity on the off chance that a crazy deity is playing a perverted game of blind poker with your eternal soul?

2: The God-Shaped Hole. Christians claim there is one in every atheist, keeping him from spiritual fulfillment.

Some commenters on Raising the Sanity Waterline don’t deny the existence of such a hole, if it is interpreted as a desire for purpose or connection to something greater than one’s self. But, some commenters say, science and rationality can fill this hole even better than God can.

What luck! Evolution has by a wild coincidence created us with a big rationality-shaped hole in our brains! Good thing we happen to be rationalists, so we can fill this hole in the best possible way! I don’t know; despite my sarcasm this may even be true. But in the least convenient possible world, Omega comes along and tells you that sorry, the hole is exactly God-shaped, and anyone without a religion will lead a less-than-optimally-happy life. Do you head down to the nearest church for a baptism? Or do you admit that even if believing something makes you happier, you still don’t want to believe it unless it’s true?

3: Extreme Altruism. John Maxwell mentions the utilitarian argument for donating almost everything to charity.

Some commenters object that many forms of charity, especially the classic “give to starving African orphans,” are counterproductive, either because they enable dictators or thwart the free market. This is quite true.

But in the least convenient possible world, here comes Omega again and tells you that Charity X has been proven to do exactly what it claims: help the poor without any counterproductive effects. So is your real objection the corruption, or do you just not believe that you’re morally obligated to give everything you own to starving Africans?

You may argue that this citing of convenient facts is at worst a venial sin. If you still get to the correct answer, and you do it by a correct method, what does it matter if this method isn’t really the one that convinced you personally?

One easy answer is that it saves you from embarrassment later. If some scientist does a study and finds that people really do have a god-shaped hole that can’t be filled by anything else, no one can come up to you and say “Hey, didn’t you say the reason you didn’t convert to religion was because rationality filled the god-shaped hole better than God did? Well, I have some bad news for you...”

Another easy answer is that your real answer teaches you something about yourself. My friend may have successfully avoided making a distasteful moral judgment, but he didn’t learn anything about morality. My refusal to take the easy way out on the transplant question helped me develop the form of precedent-utilitarianism I use today.

But more than either of these, it matters because it seriously influences where you go next.

Say “I accept the argument that I need to donate almost all my money to poor African countries, but my only objection is that corrupt warlords might get it instead”, and the obvious next step is to see if there’s a poor African country without corrupt warlords (see: Ghana, Botswana, etc.) and donate almost all your money to them. Another acceptable answer would be to donate to another warlord-free charitable cause like the Singularity Institute.

If you just say “Nope, corrupt dictators might get it,” you may go off and spend the money on a new TV. Which is fine, if a new TV is what you really want. But if you’re the sort of person who would have been convinced by John Maxwell’s argument, but you dismissed it by saying “Nope, corrupt dictators,” then you’ve lost an opportunity to change your mind.

So I recommend: limit yourself to responses of the form “I completely reject the entire basis of your argument” or “I accept the basis of your argument, but it doesn’t apply to the real world because of contingent fact X.” If you just say “Yeah, well, contingent fact X!” and walk away, you’ve left yourself too much wiggle room.

In other words: always have a plan for what you would do in the least convenient possible world.