Slack for your belief system

Follow-up to Zvi’s post on Slack

You can have Slack in your life. But you can also have Slack in your belief system.

Initially, this seems like it might be bad.

Won’t Slack result in a lack of precision? If I give myself Slack to believe in whatever, won’t I just end up with a lot of wrong beliefs? Shouldn’t I always be trying to decrease the amount of Slack in my beliefs, always striving to walk the narrow, true path?

Claims:

  1. For some things, the only way to stumble upon the Truth is to have some Slack. In other words, having no Slack in your belief system can result in getting stuck at local optima.

  2. Having Slack allows you to use fake frameworks in a way that isn’t epistemically harmful.

  3. If you are, in fact, just correct, I guess you should have zero Slack. But—just checking—are you ALSO correct about how you come to Know Things? If your way of coming to conclusions is even a little off, giving yourself zero Slack might be dangerous. (Having zero Slack in your meta process multiplies the problem of no-Slack to all downstream beliefs.)

  4. I’m willing to make the more unbacked, harder-to-define claim that there exists no individual human alive who should have zero Slack in their beliefs, on the meta level. (In other words, no human has a truth-seeking process that will reliably get all the right answers.)

[ I want to note that I fully believe I could be wrong about all four claims here, or thinking about this in the entirely wrong way. So fight me. ]

Now, I’m going to specifically discuss Slack in one’s meta process.

While I can apply the concept of Slack to individual beliefs themselves (aka “holding beliefs lightly”), I am applying it here to the question of “How do I come to know/understand anything, or call a thing true?”

So, I’m not discussing examples of “I believe X, with more or less Slack.” I’m discussing the difference between “Doing a bunch of studies is the only way to know things” (less Slack) and “Doing a bunch of studies is how I currently come to know things, but I’m open to other ways” (more Slack).

The less Slack there is in your process for forming beliefs, the more constraints you have to abide by before being able to claim you’ve come to understand something.

Examples of such constraints include:

  • I only buy it if it has had at least one peer-reviewed RCT.

  • This framework seems like it’ll lead to confirmation bias, so I will ignore it.

  • If it involves politics or tribalism or status, it can’t have any truth to it.

  • If it’s self-contradictory / paradoxical, it has to be one way or the other.

  • I can’t imagine this being true or useful because my gut reaction to it is negative.

  • I don’t feel anything about it, so it must be meaningless.

  • This doesn’t conform to my narrative or worldview. In fact it’s offensive to consider, so I won’t.

  • If I thought this, it would likely result in harm to myself or others, so I can’t think it.

  • It’s only true if I can prove it.

  • It’s only worth considering if it’s been tested empirically.

  • I should discard models that aren’t made of gears.

Note that sometimes, it is good to have such constraints, at least for now.

Not everyone can interact with facts, claims, and beliefs without some harm to their epistemics. In fact, most people cannot, I claim. (And further, I believe this to be one of the most important problems in rationality.)

That said, I see a lot of people’s orientations as:

“My belief-forming process says this thing isn’t true, and in fact this entire class of thing is likely false and not worth digging into. You seem to be actively engaging with [class of thing] and claiming there is truth in it. That seems highly dubious—there is something wrong with your belief-forming process.”

This is a reasonable stance to take.

After all, lots of things aren’t worth digging into. And lots of people have bad truth-seeking processes. Theirs may very well be worse than yours; you don’t have to consider something just because it’s in front of you.

But if you notice yourself unwilling to engage with [entire class of thing]… to me this indicates something is suboptimal.

Over time, it seems good to aim for being able to engage with more classes of things, rather than fewer.

If something is politically charged, yes, your beliefs are at risk, and you may be better off avoiding the topic altogether. But—wouldn’t it be nice if, one day, you could wade through the mire of politics and come out the other side, clean? Epistemics intact? Even better, you come out the other side having realized new truths about the world?

I guess, if I’m going to be totally honest, the reason I am saying this is that I feel annoyed when people dismiss entire [classes of thing] for reasons like, “That part of the territory is really swampy and dangerous! Going in there is bad, and you’re probably compromised.”

At least some of the time, what is actually going on is that the person just figured out how to navigate swamps.

But instead, I feel like the dismissive person lacks Slack in their belief-forming process and is also trying to enforce this lack of Slack onto others.

From the inside, I imagine this feels like, “No one can navigate swamps, and anyone who says they can is probably terribly mistaken or naive about how truth-seeking works, so I should inform them of the danger.”

From the inside, Slack will feel incorrect or potentially dangerous. Without constraints, the person may feel like they’ll go off the rails—maybe they’ll even end up believing in *gasp* horoscopes or *gasp* the existence of a Judeo-Christian God.

My greatest fear is not that I hold false beliefs. My greatest fear is getting trapped in a particular definition of truth-seeking, such that I permanently end up with many false beliefs or large gaps in my map.

The two things I do to avoid this are:

a) Learn more skills for navigating tricky territories. For example, one such skill is noticing when a belief is in my mind because it would be beneficial for me to believe it, i.e., it makes me feel good in a certain way or I expect good things to happen as a result—say, it would make a person like me more if I believed it. This likely requires a fair amount of introspective capacity.

b) Be open to the idea that other people have truth-seeking methods that I don’t. That they’re seeing entire swaths of reality I can’t see. Be curious about that, and try to learn more. Develop taste around this. Maintain some Slack, so I don’t become myopic.