Separation of Concerns

Separation of concerns is a principle in computer science which says that distinct concerns should be addressed by distinct subsystems, so that you can optimize for them separately. We can also apply the idea in many other places, including human rationality. This idea has been written about before. I'm not trying to make a comprehensive post about it, just to remark on some things I recently thought about.
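In software, the principle usually cashes out as keeping computation, presentation, and I/O in separate pieces so that each can be changed, tested, and optimized on its own. Here is a minimal, hedged sketch (the function names and the particular split are my own illustration, not anything from the original post):

```python
# Minimal, hypothetical sketch of separation of concerns in software.
# The computation, the presentation, and the I/O are kept apart so each
# can be changed, tested, and optimized separately.

def celsius_to_fahrenheit(celsius: float) -> float:
    """Pure computation: no parsing, no formatting, no I/O."""
    return celsius * 9 / 5 + 32

def format_report(celsius: float) -> str:
    """Presentation: decides how the result is shown, not what it is."""
    return f"{celsius:.1f} °C is {celsius_to_fahrenheit(celsius):.1f} °F"

if __name__ == "__main__":
    # The I/O concern lives only here.
    print(format_report(21.5))
```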

Epistemic vs Instrumental

The most obvious example is beliefs vs desires. Although the distinction may not be a perfect separation-of-concerns in practice (or even in principle), at least I can say this:

  • Even non-rationalists find it useful to make a relatively firm distinction between what is true and what they want to be true;

  • Rationalists, scientists, and intellectuals of many varieties tend to value an especially sharp distinction of this kind.

I'm particularly thinking about how the distinction is used in conversation. If an especially sharp distinction isn't being made, you might see things like:

Notice that this isn't an easy distinction to make. It isn't right at all to just ignore conversational implicature. You shouldn't only make literal statements, nor should you assume that everyone else is doing that. The skill is more like: raise the literal content of words as a hypothesis; make a distinction in your mind between what is said and anything else which may have been meant.

Side note—as with many conversation norms, the distinctions I'm mentioning in this post cannot be imposed on a conversation unilaterally. Sometimes simply pointing out a distinction works; but generally, one has to meet a conversation where it's at, and only gently try to pull it to a better place. If you're in a discussion which is strongly failing to make a true-vs-useful distinction, simply pointing out examples of the problem will very likely be taken as an attack, making the problem worse.

Making a distinction between epistemics and instrumentality seems like a kind of "universal solvent" for cognitive separation of concerns—the rest of the examples I'm going to mention feel like consequences of this one, to some extent. I think part of the reason for this is that "truth" is a concept which has a lot of separation-of-concerns built in: it's not just that you consider truth separately from usefulness; you also consider the truth of each individual statement separately, which creates a scaffolding to support a huge variety of separation-of-concerns (any time you're able to make an explicit distinction between different assertions).

But the distinction is also very broad. Actually, it's kind of a mess—it feels a bit like "truth vs everything else". Earlier, I tried to characterize it as "what's true vs what you want to be true", but taken literally, this only captures a narrow case of what I'm pointing at. There are many different goals which statements can optimize besides truth.

  • You could want to believe something because you want it to be true—perhaps you can't stand thinking about the possibility of it being false.

  • You could want to claim something because it helps argue for/against some side in a decision which you want to influence, or for/against some other belief which you want to hold for some other reason.

  • You could want to believe something because the behaviors encouraged by the belief are good—perhaps you exercise more if you believe it will make you lose weight; perhaps everyone believing in karma, or heaven and hell, makes for a stronger and more cooperative community.

Simply put, there are a wide variety of incentives on beliefs and claims. There wouldn't even be a concept of 'belief' or 'claim' if we didn't separate out the idea of truth from all the other reasons one might believe/claim something, and optimize for it separately. Yet, it is kind of fascinating that we do this even to the degree that we do—how do we successfully identify the 'truth' concern in the first place, and sort it out from all the other incentives on our beliefs?

Argument vs Premises and Conclusion

Another important distinction is to separate the evaluation of hypothetical if-then statements from any concern with the truth of their premises or conclusions. A common complaint among the more logic-minded, about the less logic-minded, is that hardly anyone is capable of properly distinguishing the claim "If X, then Y" from the claim "X, and also Y".
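For the logical core of that distinction, a small worked illustration may help. This is my own hedged sketch, treating the natural-language conditional as a material conditional for simplicity (itself an approximation of how "if" works in conversation):

```python
# Truth table contrasting the hypothetical "If X, then Y" with the
# joint claim "X, and also Y". The material conditional is only false
# when X holds and Y fails; the conjunction asserts both outright.
from itertools import product

for x, y in product([False, True], repeat=2):
    if_then = (not x) or y   # "If X, then Y" as a material conditional
    both = x and y           # "X, and also Y"
    print(f"X={x!s:<5} Y={y!s:<5}  if-then={if_then!s:<5}  and={both!s:<5}")
```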

It could be that a lack of a very sharp truth-vs-implicature distinction is what blocks people from making an if-vs-and distinction. Why would you be claiming "If X, then Y" if not to then say "by the way, X; so, Y"? (There are actually lots of reasons, but they're all much less common than making an argument because you believe the premises and want to argue the conclusion—so, that's the commonly understood implicature.)

However, it's also possible to successfully make the "truth" distinction but not the "hypothetical" distinction. Hypothetical reasoning is a tricky skill. Even if you successfully make the distinction when it is pointed out explicitly, I'd guess that there are times when you fail to make it in conversation or private thought.

Preferences vs Bids

The main reason I'm writing this post is actually because this distinction hit me recently. You can say that you want something, or say how you feel about something, without it being a bid for someone to do something about it. This is both close to the overall topic of In My Culture and a specific example (like, listed as an example in the post).

Actually, let's split this up into cases:

Preferences about social norms vs bids for those social norms to be in place. This is more or less the point of the In My Culture article; saying "in my culture" before something puts a little distance between the conversation and the preferred norm, so that the norm is put on the table as an invitation rather than being perceived as a requirement.

Proposals and preferences vs bids. Imagine a conversation about what restaurant to go to. Often, people run into a problem: no one has any preferences; everyone is fine with whatever. No one is willing to make any proposals. One reason why this might happen is that proposals, and preferences, are perceived as bids. No one wants to take the blame for a bad plan; no one wants to be seen as selfish or negligent of others' preferences. So, there's a natural inclination to lose touch with your preferences; you really feel like you don't care, and like you can't think of any options. If a strong distinction between preferences and bids is made, it gets easier to state what you prefer, trusting that the group will take it as only one data point of many to be taken together. If a distinction between proposals and bids is made, it will be easier to list whatever comes to mind, and to think of places you'd actually like to go.

Feelings vs bids. I think this one comes less naturally to people who make a strong truth distinction—there's something about directing attention toward the literal truth of statements which directs attention away from how you feel about them, even though how you feel is something you can also try to have true beliefs about. So, in practice, people who make an especially strong truth distinction may nonetheless treat statements about feelings as if they were statements about the things the feelings are about, precisely because they're hypersensitive to other people failing to make that distinction. So: know that you can say how you feel about something without it being anything more. Feeling angry about someone's statement doesn't have to be a bid for them to take it back, or a claim that it is false. Feeling sad doesn't have to be a bid for attention. An emotion doesn't even have to reflect your more considered preferences.

When a group of people is skilled at making a truth distinction, certain kinds of conversation, and certain kinds of thinking, become much easier: all sorts of beliefs can be put out into the open where they otherwise couldn't, allowing the collective knowledge to go much further. Similarly, when a group of people is skilled at the feelings distinction, I expect things can go places where they otherwise couldn't: you can mention in passing that something everyone else seems to like makes you sad, without it becoming a big deal; there is sufficient trust that you can say how you are feeling about things, in detail, without expecting it to make everything complicated.

The main reason I wrote this post is that someone was talking about this kind of interaction, and I initially didn't see it as very possible or necessarily desirable. After thinking about it more, the analogy to making a strong truth distinction hit me. Someone stuck in a culture without a strong truth distinction might similarly see such a distinction as 'not possible or desirable': the usefulness of an assertion is obviously more important than its truth; in reality, being overly obsessed with truth will both make you vulnerable (if you say true things naively) and ignorant (if you take statements at face value too much, ignoring connotation and implicature); even if it were possible to set aside those issues, what's the use of saying a bunch of true stuff? Does it get things done? Similarly: the truth of the matter is more important than how you feel about it; in reality, stating your true feelings all the time will make you vulnerable and perceived as needy or emotional; even if you could set those things aside, what's the point of talking about feelings all the time?

Now it seems both possible and simply good, for the same reason that a strong truth distinction is.

I can't say a whole lot about the benefits of such a culture, because I haven't really experienced it. This kind of thing is part of what circling seems to be about, in my mind. I think the rationalist community as I've experienced it goes somewhat in that direction, but definitely not all the way.