Purpose and Pragmatism

Followup to: Making Beliefs Pay Rent, Lost Purposes

Thus runs the ancient parable:

If a tree falls in a forest and no one hears it, does it make a sound?
One says, “Yes it does, for it makes vibrations in the air.”
Another says, “No it does not, for there is no auditory processing in any brain.”

So begins a long, acrimonious battle...

The conventional resolution is that the two are fighting over the definition of a word, and such labels do not have intrinsic definitions, only agreed-upon definitions.

Yet if you need to know about the forest for any pragmatic reason—if there is anything you plan on doing with the knowledge—then the answer is no longer a matter of mutual agreement. If, for example, you need to know whether landmines will be set off by the tree falling, then you cannot make the landmines explode or unexplode by any possible amount of agreement about the meaning of the word “sound”. You can get the whole world to agree, one way or the other, and it still won’t make a difference.

You find yourself in an unheard-falling-tree dilemma only when you become curious about a question with no pragmatic use, and no predictive consequences. Which suggests that you may be playing loose with your purposes.

So does this mean that truth reduces to usefulness? But this, itself, would be a purpose-loss, a subgoal stomp, a mistaking of the indicator for the indicated. Usefulness for prediction, and demonstrated powers of manipulation, is one of the best indicators of truth. This does not mean that usefulness is truth. You might as well say that the act of driving to the supermarket is eating chocolate.

There is, nonetheless, a deep similarity between the pragmatic and the epistemic arts of rationality, in the matter of keeping your eye on the ball.

In pragmatic rationality, keeping your eye on the ball means holding to your purpose: Being aware of how each act leads to its consequence, and not losing sight of utilities in leaky generalizations about expected utilities. If you hold firmly in your mind the image of a drained swamp, you will be less likely to get lost in fighting alligators.

In epistemic rationality, keeping your eye on the ball means holding to your question: Being aware of what each indicator says about its indicatee, and not losing sight of the original question in fights over indicators. If you want to know whether landmines will detonate, you will not get lost in fighting over the meaning of the word “sound”.

Both cases deal with leaky generalizations about conditional probabilities: P(Y=y|X=x) is nearly, but not quite, 1.

In the case of pragmatic rationality: driving to the supermarket may almost always get you chocolate, but on some occasions it will not. If you forget your final purpose and think that x=y, then you will not be able to deal with cases where the supermarket is out of chocolate.

In the case of epistemic rationality: seeing a “Chocolate for sale” sign in the supermarket may almost always indicate that chocolate is available, but on some occasions it will not. If you forget your original question and think that x=y, then you will go on arguing “But the sign is up!” even when someone calls out to you, “Hey, they don’t have any chocolate today!”
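To make “nearly, but not quite, 1” concrete, here is a minimal sketch in Python. The numbers are purely illustrative assumptions (a 90% chance the chocolate is in stock, a 5% chance a stale sign stays up when it is not); none of them come from the text above. The point is only that conditioning on the indicator, the sign, still leaves a small chance that the indicated thing, chocolate on the shelf, is missing.

```python
import random

random.seed(0)

def estimate_p_stock_given_sign(trials: int = 100_000) -> float:
    """Toy model of a leaky indicator.

    Assumed (made-up) parameters:
      - chocolate is in stock 90% of the time;
      - the sign is up whenever chocolate is in stock,
        and also (stale) 5% of the time when it is not.
    Returns an estimate of P(stock | sign up).
    """
    sign_up = 0
    sign_up_and_stocked = 0
    for _ in range(trials):
        stocked = random.random() < 0.90
        sign = stocked or (random.random() < 0.05)  # stale sign sometimes stays up
        if sign:
            sign_up += 1
            sign_up_and_stocked += stocked
    return sign_up_and_stocked / sign_up

print(f"P(chocolate in stock | sign is up) ~ {estimate_p_stock_given_sign():.3f}")
# Prints a value close to, but below, 1.0: the sign is evidence, not the thing itself.
```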

This is a deep connection between the human arts of pragmatic and epistemic rationality...

...which does not mean they are the same thing.