Epistemic Slipperiness

(epistemic status: thinking out loud, kinda confused)

Sometimes I interact with a person, and something about their thinking feels… slippery.

Probably, sometimes my own mind is slippery too. But I can’t quite pin down when.

This is probably actually several different phenomena, but they blur together in my current mind. I’m confused about it. This post is me trying to deconfuse myself. Let’s explore a few examples and see what shakes out.

Subtly Bad Jokes and Slipping Sideways

I first got the concept of ‘epistemic slipping’ from this Eliezer Facebook post in 2017, which argued that there was something off about people joking that Trump winning the presidency meant we must be living in a simulation. Quoting the relevant part:

When I’m asking whether some person known to be in the Advanced Epistemology 101 club is *strongly* trustworthy, one of the primary questions I ask is: Does this person slip sideways in reality, even just a little, in order to resolve their internal tensions?

You don’t know in advance what crisis is going to hit. But when a crisis hits, it often creates some mental tension that can be diminished by slipping sideways in reality to a world that is false but not *blatantly* false. No, person Y couldn’t really be lying to me!

And since I see this sideways-slipping as this huge remaining foundational sin even for many people in the AE101 club… well, you can see why I was worried about hearing some people making Simulation Hypothesis jokes about Trump. It expresses a wish for this world not to be real, an impulse to slip sideways out of the tension, even if only by saying that this world is a simulation and so it doesn’t count. The conceptual link to the Simulation Hypothesis seems to show a forming fault-line—like the person’s brain now has a new way to escape from reality in case of crisis. I worry both that the general pressure exists and hasn’t been defeated, and that the Simulation Hypothesis in particular is a fault-line that could crack if the person gets put under enough pressure.

(ADDED: It’s not that the fault-line says a Trump victory is okay because simulations are ontologically unreal. The feeling is rather that simulations don’t count toward the Laws that Must Be Obeyed, the socially-valent generalizations whose violation feels uncomfortable. It was mandatory for Hillary Clinton to win in the real world, the world that *counts*; but she doesn’t have to win in a simulated world. That’s why the joke is funny.)

The concept of ‘slipping sideways’ seemed really important to me, and seemed to come up in multiple other places. More recently, I’ve felt a sense of slipperiness as I watch some people argue about various x-risk related topics. And I sometimes see people make plans that don’t make sense, in ways that are hard to pin down.

Lately I’ve been wanting to get more clarity on this for a few reasons:

  • First, I probably slip sideways sometimes, and I want to get better at noticing that, and not doing so.

  • Second, I want to notice when other people are slippery, so that I can trust them ‘an appropriate amount’, which may mean filtering them out of some hiring processes.

  • Third, ideally when I notice someone being epistemically slippery I’d like to communicate with them about it.

Okay, what are some other examples?

I can’t actually think of examples that have exactly the same quality as the Trump/​Simulation thing. But when I query for examples of people who have felt ‘slippery’ in some way to me in the past, I get:

The ideologue who doesn’t notice themselves shifting goalposts or imposing double standards, or more generally doesn’t notice that they’re “fighting for a side” rather than truthseeking.

The young idealist who wants to “make a difference”, but ends up trying to justify a strategy to themselves that seems to be a confused mess of “make use of the college degree they’ve already started”, “feel vaguely good about themselves”, “be prestigious”, and “make their parents proud”. Their plan is sorta reasonable, but when you ask them what they’re actually trying to achieve, their reasoning feels subtly off. If you ask “why are you doing X instead of Y?” they somehow… slide off the question.

The startup founder who’s really excited about their plan/mission and keeps talking about it. And their plan… doesn’t really make sense, and they somehow manage to make each new fact seem like it supports their existing mission, even when it feels like it should be falsifying it.

The angry guy conflating things, who is upset at Alice for what he perceives as a violation of an important norm. He doesn’t separate out his observations about Alice from his judgments and assumptions about her, and then presents that mishmash as if it were objective truth.

The manipulative negotiator who is deliberately vague about what they’re offering and what benefits you’ll receive. When you try to pin them down, they say things that superficially sound concrete, but when you go back and check the record you realize they were still kinda weaseling out of committing to anything.

The manipulative guru who keeps responding to your concerns with vaguely reasonable, true-ish-sounding things that nonetheless leave you feeling painted into a corner.

The guy who’s really stuck in their ontology: you try to make an argument, but it doesn’t fit into their frame, so they either keep rounding the things you’re trying to say to the nearest concept in their own frame, or they conclude you’re talking nonsense and don’t bother listening.

The guy who’s stuck in their ontology (while trying to solve their own problems). Same as the previous, but even when they’re really motivated to solve their own problem, and aren’t arguing with anyone (or are maybe talking to a close friend they’re earnestly trying to listen to), they still keep missing concepts that don’t fit into their preconceptions.

The strangely uninterested guy: you tell them something pretty important, that you think should be really relevant to their worldview (but maybe implies they’re missing something important), and they… just don’t seem that interested.

Why call it “Slipperiness”?

Okay, so I listed a bunch of examples. Are they actually the same phenomenon? The examples contain a bunch of rationality-errors, but maybe those are just a bunch of disparate errors, and “epistemic slipperiness” is just “not being very good at rationality in a bunch of unrelated domains.”

Eliezer’s example was specifically about resolving internal mental pressure. Is that what all of these examples are about? Is that framing sufficient, or even useful?

A Noticing-Handle

I think I ended up writing this post because I’ve observed a bunch of situations that ‘felt slippery’ to me, and ‘feeling slippery’ was a noticing-handle I could explore and then build a habit off of (i.e. ‘notice the feeling of slipperiness’ → ‘deliberately do something to reduce the problems the slipperiness causes’).

Upon reflection, I think ‘slipperiness’ is mostly a feeling I get around other people.

When I’m the one slipping, the feeling is more specific. When I’ve been the ideologue, or the idealist with the confused plan, or the guy stuck in his ontology, each of those feels different from the inside. It feels like defensiveness, or fear, or righteously doing the right thing.

But when I’m talking to someone else, and I keep trying to point at some reasonable concept, and they keep missing it, and persist in missing it even after I’ve tried to point it out multiple ways, that gives me a slippery sensation.

Taxonomy of Slipperiness

Some distinctions come up when I look over each of the above examples and reflect on where slipperiness has come up in my life:

  • Invisible rationality errors that you completely fail to notice.

  • Persistent, reinforced rationality errors: maybe you do notice them, or someone points them out, but even then you still have trouble seeing them.

  • Communication failures. These usually involve at least one person making some kind of “internal rationality-error”, but I think there are also some additional failures-that-feel-slippery where one person doesn’t quite know how to communicate, and another person doesn’t know how to listen.

I think these are mostly different phenomena.

For me, the “Trump/simulation” thing was invisible, but not persistent. When I read the FB post, I was like “oh, I totally might have made the Trump/simulation joke without noticing that it was an important sideways slide into non-reality, and yes, upon reflection, it totally was a sideways slide to avoid living in an uncomfortable reality.” But as soon as it was pointed out, I was like “oh, yeah, that makes sense.”

By contrast – I’ve sometimes been “the angry guy conflating observations with assumptions.” Someone points that out, and I’m either like “nuh-uh” or “I dunno, maybe, but you’re being unreasonable”, and I don’t really hear the thing. I’m in the middle of an argument, and much of my cognition is geared towards “winning”. I’m incentivized not to notice the rationality-error I’m making. It’s not until I’m outside the current conflict that I’m able to look at it clearly and thoughtfully. The failure is persistent.

Finally, there are moments where I perceive ‘slipperiness’ and the main ingredient seems to be a joint failure to communicate. This particularly comes up with frame/ontology disagreements, where one person is trying to “communicate in their native language”, and the other person just has a totally different native language.

Depth in persistent slipperiness

One type of slipping is to kinda notice the type of error you’re making, and then convince yourself that you now “get it”, when in fact you’ve only grasped a pale shadow of the error.

Imagine getting into a fight with your spouse. They’re upset that you got them the wrong kind of birthday cake. You might think “oh I get it, they really care about cake. Sorry for messing that up.” But, actually, what they’re upset about is that they mentioned their preferred cake flavor recently, and you didn’t pay attention and remember. Wait, no, actually the problem is that you’ve been not paying that much attention to their preferences for months/​years, and the cake situation just happened to be the straw that broke the camel’s back.

And meanwhile, you have a self-image as a good spouse, so each step along the way you’ve got an incentive to think to yourself “ah, I get it, I’m paying attention and doing a good job listening”, when in fact the problem was subtler and required more active listening.

Here, the cognitive error is “skipping to the step where you’re pretty sure you understand the problem and are ready to execute on a solution, before listening and fleshing out that understanding, and then doing that multiple times.” And it’s noteworthy that you can successfully gain a real grasp of part of the problem (“spouse has strong opinions about cake”), which will indeed successfully predict some future problems (now, whenever the topic of cake comes up, they’re kinda tense, because it reminds them of the last time you didn’t listen to them and it’s kinda become a Whole Thing).


Appendix

I have more thoughts on slipperiness, but they feel less “blogpost”-shaped and more “random thoughts”-shaped. A teaser of some things I’m still thinking about:

  • How do you talk to someone who seems to be persistently missing concepts in a slippery way? In default-culture, this is kinda insulting to state directly.

  • I mentioned “frame confusion” as a way you could get a “slippery communication failure.” But I also think “frame confusion” might be a source of slipperiness inside a single person. The Young Idealist With a Confused Plan might be equivocating between different types of “successful plans”, and ways of thinking about successful plans, and not noticing when they’ve made the switch.

  • Coordination and “fake plans.” Part of my motivation here is a certain kind of slipperiness that shows up in people’s x-risk plans, which I’m not sure what to do about.