Epistemic Effort

Epistemic Effort: Thought seriously for 5 minutes about it. Thought a bit about how to test it empirically. Spelled out my model a little bit. I’m >80% confident this is worth trying and seeing what happens. Spent 45 min writing post.

I’ve been pleased to see “Epistemic Status” hit a critical mass of adoption—I think it’s a good habit for us to have. In addition to letting you know how seriously to take an individual post, it sends a signal about what sort of discussion you want to have, and helps remind other people to think about their own thinking.

I have a suggestion for an evolution of it: “Epistemic Effort” instead of “Epistemic Status”. Instead of “how confident you are”, it’s more a measure of “what steps did you actually take to make sure this was accurate?”, with some examples including:

  • Thought about it musingly

  • Made a 5 minute timer and thought seriously about possible flaws or refinements

  • Had a conversation with other people you epistemically respect, who helped refine it

  • Thought about how to do an empirical test

  • Thought about how to build a model that would let you make predictions about the thing

  • Did some kind of empirical test

  • Did a review of relevant literature

  • Ran a Randomized Controlled Trial

[Edit: the intention with these examples is for the list to start with things that are fairly easy to do, to get people in the habit of thinking about how to think better, but to have it quickly escalate to “empirical tests, hard-to-fake evidence, and exposure to falsifiability”]

A few reasons I think this is worth trying (most of these are “things that seem likely to me” that I haven’t made any formal effort to test; they come from some background in game design and from reading books on habit formation, most of which weren’t very well cited):

  • People are more likely to put effort into being rational if there’s a relatively straightforward, understandable path to do so

  • People are more likely to put effort into being rational if they see other people doing it

  • People are more likely to put effort into being rational if they are rewarded (socially or otherwise) for doing so.

  • It’s not obvious that people will get _especially_ socially rewarded for doing something like “Epistemic Effort” (or “Epistemic Status”), but there are mild social rewards just for doing something you see other people doing, and a mild personal reward simply for doing something you believe to be virtuous. (I wanted to say “dopamine reward”, but then realized I honestly don’t know whether that’s the mechanism; “small internal brain happy feeling” will have to do.)

  • Less Wrong (and similar projects) is more valuable if more of the people involved put effort into thinking and communicating “rationally” (i.e. making an effort to ensure their beliefs align with the truth, and communicating in ways that help other people’s beliefs align with the truth)

  • People vary in the ability and time they can put into epistemic effort, but if there are easily achievable, well-established “low end” efforts that are easy to remember and do, this lowers the barrier for newcomers to start building good habits. Having a nice range of recommended actions can provide a pseudo-gamified structure where there’s always another, slightly harder step available to you.

  • In the process of writing this very post, I actually went from planning a quick, two-paragraph post to the current version, when I realized I should really eat my own dogfood and make at least a minimal effort to increase my epistemic effort here. I didn’t have much time, so I used a couple of the simpler techniques. But even that, I think, provided a lot of value.

Results of thinking about it for 5 minutes:

  • It occurred to me that explicitly demonstrating the results of putting epistemic effort into something might be motivational, both for me and for anyone else thinking about doing this; hence this entire section. (This is sort of stream-of-consciousness-y, because I didn’t want to force myself to do so much that I ended up going “ugh, I don’t have time for this right now, I’ll do it later.”)

  • One failure mode is that people end up putting minimal, token effort into things (e.g. randomly trying something on a couple of double-blinded people and calling it a Randomized Controlled Trial).

  • Another is that people might end up defaulting to whatever the “common” sample efforts are, instead of thinking more creatively about how to refine their ideas. I think the benefit of providing a clear path to people who weren’t thinking about this at all outweighs the cost of some people ending up less agenty about their epistemology, but it seems like something to be aware of.

  • I don’t think it’s worth the effort to run a “serious” empirical test of this, but if a number of people started doing this on their posts, I do think it’d be worth running an informal follow-up survey asking “Did you do this? Did it work out for you? Do you have feedback?”

  • A neat nice-to-have, if people actually started adopting this and it proved useful, might be for it to automatically appear at the top of new posts, along with a link to a wiki entry that explained what the deal was.

Next actions, if you found this post persuasive:

Next time you’re writing any kind of post intended to communicate an idea (whether on Less Wrong, Tumblr or Facebook), try adding “Epistemic Effort: ” to the beginning of it. If it was intended to be a quick, lightweight post, just write it in its quick, lightweight form.

After the quick, lightweight post is complete, think about whether it’d be worth doing something as simple as “set a 5 minute timer and think about how to refine/refute the idea”. If not, just write “thought about it musingly” after “Epistemic Effort:”. If so, start thinking about it more seriously and see where it leads.

While thinking about it for 5 minutes, some questions worth asking yourself:

  • If this were wrong, how would I know?

  • What actually led me to believe this was a good idea? Can I spell that out? In how much detail?

  • Where might I check to see if this idea has already been tried/discussed?

  • What pieces of the idea might I peel away or refine to make it stronger? Are there individual premises I might be wrong about? Do they invalidate the idea? Does removing them lead to a different idea?