Moorean Statements

Moorean statements are statements of the form:

It’s not raining but I believe it is.

These statements sound strange because any agent that outright tells you it’s not raining must already at least tacitly represent that fact in their world model. That agent is plugged into their world model well enough to report what it says, but not well enough to accurately model their model. For a Moorean statement this explicit, the epistemic strangeness is so obvious that basically no one will have that combination of access to, and confusion about, their world model.

An Eliezerism my Eliezer-model often generates is that many social scripts involve expressing Moorean propositions. They’re subtler, but the essential confusion is the same.

I’m a committed Christian because my parents are—that’s just how I was raised.

Well, if intuitions aren’t epistemically admissible in philosophy, philosophers would be out of a job!

What? How can you simultaneously recognize the non-epistemic generator of your belief and still hold the belief?

Can you generate more instances?