“Surely they can’t have formed a publishing empire, and survived to the point where they could hire assistants, solely by doing what ‘the proper God-appointed authorities’ told them to do? ”
Dunno; I wouldn’t underestimate the extent to which plain instinct can make one behave in a rational-seeming manner even when one’s beliefs aren’t rational. How those instincts get rationalized post-hoc, if they’re rationalized at all, isn’t that relevant.
“Perhaps we could tell the authors to imagine in detail what they would see if (or when) God shows them how all their facts fit together, and exactly how this would allow them to answer any and all objections.”
I would agree with Eliezer’s rule more than with yours here. For one thing, the issue isn’t so much that L&J aren’t following the right rationality rules; I suspect they don’t want to follow the right rationality rules. I don’t know if they haven’t realized they have to follow them to be right, or don’t care that much about being right (or to be more accurate, they’re sufficiently married to their current worldview that they don’t even want to consider it might be wrong), but I’m pretty sure if someone suggested they follow either your rule or Eliezer’s they’d just stare blankly.
So there’s that. But if we assume we managed to get them to listen to what we say, then I think Eliezer’s rule would work much better, because it’s much harder to misuse. “Ask yourself how things would actually work” is prone to rationalization; I can just picture the sentence getting translated by some brain module as “picture your current model. Nice, innit?”
Or, put another way, I think that the part of the brain that actually examines one’s beliefs and the part that gives you the warm glow of self-satisfaction from being right are not the same part. Your question will get intercepted by the warm-glow part of the brain; Eliezer’s question will not. In fact, it looks specifically designed to avoid it.
In particular, if “try to find the thought that hurts you the most” would elicit “get behind me, Satan”, I’m not convinced that “try to work out how your worldview would actually work” wouldn’t have the same result. Satan is the Great Deceiver, after all. How easy would it be to assume, once you meet the first contradiction, that Satan is clouding your thoughts...