Moore’s Paradox

I think I understand Moore’s Paradox a bit better now, after reading some of the comments on Less Wrong. Jimrandomh suggests:

Many people cannot distinguish between levels of indirection. To them, “I believe X” and “X” are the same thing, and therefore, reasons why it is beneficial to believe X are also reasons why X is true.

I don’t think this is correct—relatively young children can understand the concept of having a false belief, which requires separate mental buckets for the map and the territory. But it points in the direction of a similar idea:

Many people may not consciously distinguish between believing something and endorsing it.

After all—“I believe in democracy” means, colloquially, that you endorse the concept of democracy, not that you believe democracy exists. The word “belief”, then, has more than one meaning. We could be looking at a confused word that causes confused thinking (or maybe it just reflects pre-existing confusion).

So: in the original example, “I believe people are nicer than they are”, she came up with some reasons why it would be good to believe people are nice—health benefits and such—and since she now had some warm affect on “believing people are nice”, she introspected on this warm affect and concluded, “I believe people are nice”. That is, she mistook the positive affect attached to the quoted belief as signaling her belief in the proposition. At the same time, the world itself seemed like people weren’t so nice. So she said, “I believe people are nicer than they are.”

And that verges on being an honest mistake—sort of—since people are not taught explicitly how to know when they believe something. As in the parable of the dragon in the garage: the one who says “There is a dragon in my garage—but it’s invisible” does not recognize his anticipation of seeing no dragon as indicating that he possesses an (accurate) model with no dragon in it.

It’s not as if people are trained to recognize when they believe something. It’s not like they’re ever taught in high school: “What it feels like to actually believe something—to have that statement in your belief pool—is that it just seems like the way the world is. You should recognize this feeling, which is actual (unquoted) belief, and distinguish it from having good feelings about a belief that you recognize as a belief (which means that it’s in quote marks).”

This goes a long way toward making this real-life case of Moore’s Paradox seem less alien, and providing another mechanism whereby people can be simultaneously right and wrong.

Likewise Kurige, who wrote:

I believe that there is a God—and that He has instilled a sense of right and wrong in us by which we are able to evaluate the world around us. I also believe a sense of morality has been evolutionarily programmed into us—a sense of morality that is most likely a result of the formation of meta-political coalitions in Bonobo communities a very, very long time ago. These two beliefs are not contradictory, but the complexity lies in reconciling the two.

I suspect, Kurige, you have decided that you have reasons to endorse the quoted belief that God has instilled a sense of right and wrong in us. And also that you have reasons to endorse the verdict of science. They both seem like good communities to join, right? There are benefits to both sets of beliefs? You introspect and find that you feel good about both beliefs?

But you did not say:

“God instilled a sense of right and wrong in us, and also a sense of morality has been evolutionarily programmed into us. The two states of reality are not inconsistent, but the complexity lies in reconciling the two.”

If you’re reading this, Kurige, you should very quickly say the above out loud, so you can notice that it seems at least slightly harder to swallow—notice the subjective difference—before you go to the trouble of rerationalizing.

This is the subjective difference between having reasons to endorse two different beliefs, and your mental model of a single world, a single way-things-are.