Hi! Yep, it’s the same me, thanks for the welcome!
I don’t know if I’d call integrating knowledge THE root problem of [i]Left Behind[/i] (which has many root problems), and a lack of integration strikes me as too high-level and widespread among humans to qualify as [i]root[/i] per se...
But yeah, good illustration of the principle :-)
(and thanks for the welcome link; I’d somehow missed that page)
Well, I was trying to think of a general rule that L&J could follow in order to repair their worldview. (Obviously we should consider following this rule as well, if we can find one.) I came up with, ‘Ask how all the facts, as you see them, fit together.’
We could probably find a better version. Eliezer suggests the rule, ‘Try to find the thought that hurts the most,’ or loosely paraphrased, ‘Ask what the Sorting Hat from Methods of Rationality would tell you.’ But for L&J such an imaginary conversation would likely end with the phrase, ‘Get behind me, Satan!’ They do not expect to have the brains to see how their facts fit together, considering this the province of God. And yet I feel like they must have some experience with judging beliefs successfully. Surely they can’t have formed a publishing empire, and survived to the point where they could hire assistants, solely by doing what ‘the proper God-appointed authorities’ told them to do? (Though they do lump airplane pilots and scientists together as ‘practical authorities.’ And we know that more belief in the wisdom of ‘traditional authority’ correlates with more gullibility about any claim the authorities in question have not condemned—see Chapter 3 here. Hmm.)
So I want to say that a personalized version of my rule would have a better effect than imagining criticisms directly. Perhaps we could tell the authors to imagine in detail what they would see if (or when) God shows them how all their facts fit together, and exactly how this would allow them to answer any and all objections. This seems connected with the act of imagining a paradise you would actually prefer to Reedspacer’s Lower Bound as a long-term home. Both involve the rule, ‘Ask how it would actually work given what you know about people/yourself.’
You might think that authors who wrote about the kingdom Jesus will establish on Earth wouldn’t need to hear these rules. You’d be wrong. :-)
“Surely they can’t have formed a publishing empire, and survived to the point where they could hire assistants, solely by doing what ‘the proper God-appointed authorities’ told them to do?”
Dunno; I wouldn’t underestimate the extent to which plain instinct can make one behave in a rational-seeming manner even when one’s beliefs aren’t rational. How those instincts are rationalized post hoc, if they’re rationalized at all, isn’t all that relevant.
“Perhaps we could tell the authors to imagine in detail what they would see if (or when) God shows them how all their facts fit together, and exactly how this would allow them to answer any and all objections.”
I would agree with Eliezer’s rule more than with yours here. For one thing, the issue isn’t so much that L&J aren’t following the right rationality rules; I suspect they don’t want to follow the right rationality rules. I don’t know whether they haven’t realized they have to follow them to be right, or just don’t care that much about being right (or, more accurately, they’re sufficiently married to their current worldview that they don’t even want to consider it might be wrong), but I’m pretty sure that if someone suggested they follow either your rule or Eliezer’s, they’d just stare blankly.
So there’s that. But if we assume we managed to get them to listen to what we say, then I think Eliezer’s rule would work much better, because it’s much harder to misuse. “Ask yourself how things would actually work” is prone to rationalization; I can just picture the sentence getting translated by some brain module as “Picture your current model. Nice, innit?”
Or, put another way: I think the part of the brain that actually examines one’s beliefs and the part that gives you the warm glow of self-satisfaction from being right are not the same part. Your question will get intercepted by the warm-glow part; Eliezer’s question will not. In fact, it looks specifically designed to avoid it.
In particular, if “try to find the thought that hurts you the most” would elicit “Get behind me, Satan!”, I’m not convinced that “try to work out how your worldview would actually work” wouldn’t have the same result. Satan is the Great Deceiver, after all. How easy would it be to assume, once you meet the first contradiction, that Satan is clouding your thoughts...