Slack for your belief system

Follow-up to Zvi’s post on Slack

You can have Slack in your life. But you can also have Slack in your belief system.

Initially, this seems like it might be bad.

Won’t Slack result in a lack of precision? If I give myself Slack to believe in whatever, won’t I just end up with a lot of wrong beliefs? Shouldn’t I always be trying to decrease the amount of Slack in my beliefs, always striving to walk the narrow, true path?

Claims:

  1. For some things, the only way to stumble upon the Truth is to have some Slack. In other words, having no Slack in your belief system can result in getting stuck at local optima.

  2. Having Slack allows you to use fake frameworks in a way that isn’t epistemically harmful.

  3. If you are, in fact, just correct, I guess you should have zero Slack. But—just checking—are you ALSO correct about how you come to Know Things? If your way of coming to conclusions is even a little off, giving yourself zero Slack might be dangerous. (Having zero Slack in your meta process multiplies the problem of no-Slack to all downstream beliefs.)

  4. I’m willing to make the more unbacked, harder-to-define claim that there exists no individual human alive who should have zero Slack in their beliefs, on the meta level. (In other words, no human has a truth-seeking process that will reliably get all the right answers.)

[ I want to note that I fully believe I could be wrong about all four claims here, or thinking about this in the entirely wrong way. So fight me. ]

Now, I’m going to specifically discuss Slack in one’s meta process.

So, while I can apply the concept of Slack to individual beliefs themselves (aka “holding beliefs lightly”), I am applying the concept more to the question of “How do I come to know/understand anything or call a thing true?”

So, I’m not discussing examples of “I believe X, with more or less Slack.” I’m discussing the difference between, “Doing a bunch of studies is the only way to know things” (less Slack) vs. “Doing a bunch of studies is how I currently come to know things, but I’m open to other ways” (more Slack).

The less Slack there is in your process for forming beliefs, the more constraints you have to abide by before being able to claim you’ve come to understand something.

Examples of such constraints include:

  • I only buy it if it has had at least one peer-reviewed RCT.

  • This framework seems like it’ll lead to confirmation bias, so I will ignore it.

  • If it involves politics or tribalism or status, it can’t have any truth to it.

  • If it’s self-contradictory / paradoxical, it has to be one way or the other.

  • I can’t imagine this being true or useful because my gut reaction to it is negative.

  • I don’t feel anything about it, so it must be meaningless.

  • This doesn’t conform to my narrative or worldview. In fact it’s offensive to consider, so I won’t.

  • If I thought this, it would likely result in harm to myself or others, so I can’t think it.

  • It’s only true if I can prove it.

  • It’s only worth considering if it’s been tested empirically.

  • I should discard models that aren’t made of gears.

Note that sometimes, it is good to have such constraints, at least for now.

Not everyone can interact with facts, claims, and beliefs without some harm to their epistemics. In fact, most people cannot, I claim. (And further, I believe this to be one of the most important problems in rationality.)

That said, I see a lot of people’s orientations as:

“My belief-forming process says this thing isn’t true, and in fact this entire class of thing is likely false and not worth digging into. You seem to be actively engaging with [class of thing] and claiming there is truth in it. That seems highly dubious—there is something wrong with your belief-forming process.”

This is a reasonable stance to take.

After all, lots of things aren’t worth digging into. And lots of people have bad truth-seeking processes. Theirs may very well be worse than yours; you don’t have to consider something just because it’s in front of you.

But if you notice yourself unwilling to engage with [entire class of thing]… to me this indicates something is suboptimal.

Over time, it seems good to aim for being able to engage with more classes of things, rather than fewer.

If something is politically charged, yes, your beliefs are at risk, and you may be better off avoiding the topic altogether. But—wouldn’t it be nice if, one day, you could wade through the mire of politics and come out the other side, clean? Epistemics intact? Even better, you come out the other side having realized new truths about the world?

I guess if I’m going to be totally honest, the reason I’m saying this is that I feel annoyed when people dismiss entire [classes of thing] for reasons like, “That part of the territory is really swampy and dangerous! Going in there is bad, and you’re probably compromised.”

At least some of the time, what’s actually going on is that the person just figured out how to navigate swamps.

But instead, I feel like the person lacks Slack in their belief-forming process and is also trying to enforce this lack of Slack onto others.

From the inside, I imagine this feels like, “No one can navigate swamps, and anyone who says they can is probably terribly mistaken or naive about how truth-seeking works, so I should inform them of the danger.”

From the inside, Slack will feel incorrect or potentially dangerous. Without constraints, the person may feel like they’ll go off the rails—maybe they’ll even end up believing in *gasp* horoscopes or *gasp* the existence of a Judeo-Christian God.

My greatest fear is not having false beliefs. My greatest fear is getting trapped into a particular definition of truth-seeking, such that I permanently end up with many false beliefs or large gaps in my map.

The two things I do to avoid this are:

a) Learn more skills for navigating tricky territories. For example, one such skill is noticing when a belief is in my mind because it would be beneficial for me to believe it, i.e. it makes me feel good in a certain way, or I expect good things to happen as a result (say, a person would like me more if I believed it). This likely requires a fair amount of introspective capacity.

b) Be open to the idea that other people have truth-seeking methods that I don’t. That they’re seeing entire swaths of reality I can’t see. Be curious about that, and try to learn more. Develop taste around this. Maintain some Slack, so I don’t become myopic.