Leave beliefs that don’t constrain experience alone

Epistemic status: This is the first time I’ve expressed these thoughts. I’ve long thought that people can do their jobs well and still be numbskulls in every other area of life. Here I argue that it’s OK to be a numbskull.

I read Raising the Sanity Waterline some time ago. I thought, “These are great points! I’ve needed them!” I made arguments that used those points a few times.

When I listened to the Bayesian Conspiracy's episode on it, I thought, “How did BC get this article so wrong? RtSW isn’t about making oblique attacks on religion by teaching people things like Occam’s Razor!”

It is about that!

I think I took these sentences and drew a different conclusion:

Behind every exciting, dramatic failure, there is a more important story about a larger and less dramatic failure that made the first failure possible.
The general sanity waterline is currently really ridiculously low. Even in the highest halls of science.

I wrongly took this article to mean (in fact, this is what I still believe):

You can be a Nobel prize-winning scientist and believe in God. The sanity waterline is so low that having many, many irrational beliefs won’t impact your ability to contribute to the world and be happy. Why is that? One hypothesis is that if you have just a few beliefs that pay rent, you’re doing better than most. Just as the difference between rationalist and rationalist-adjacent people is small, the difference between an imagined highly-effective scientist and a highly-effective scientist who believes in God is small. In general, the effect of adding another belief that pays rent is logarithmic.
If this bothers you, your options are limited. It’s not effective to relieve people of their belief in God, because the low sanity waterline predicts that they’ll have many other low-water beliefs. And beliefs don’t have to pay rent to stick around.
But you can embark on a more ambitious project: raise the sanity waterline by making a more rational world. A world so effective and rational that a belief in God impedes your progress is a rational world, indeed.

I don’t think the problem is that Nobel prize-winning scientists don’t understand “epistemology 101.” The problem is that epistemology 101 needs to pay rent for that scientist! Not that you asked, but the broken coupling of professional achievement with assumed rationality makes it hard for me to tell others, “Join us! We win!”

Is there a term for “less-examined beliefs that don’t have to pay rent for you to happily contribute in the way you like?” To take myself as an example, I do not examine my belief in anthropogenic climate change. I have never read a paper on it. I never will. I trust everyone around me (I live in a liberal town), and I do what they tell me.

I don’t see a problem with not investigating climate change myself, because I won’t change my behavior or thoughts whether I decide climate change is true or false. The amount of effort I put toward climate-change-mitigating activities (recycling… TK list other things) is determined by social pressure.

I’d hazard that beliefs stay less-examined for a boring reason: they can. Beliefs get examined because they have to be.