But… I said I was used to it, and remembering it being fuzzy!
(I’m not sure we understand each other, but...) Okay. But I mean, when Leibniz talks about theistic concepts, his reasoning is not very fuzzy. Insofar as smart theists use memes descended from Leibniz—which they do, along with memes descended from other very smart people—I need to be able to translate their concepts into concepts that I can understand and use my normal rationality skillz on.
I don’t think this is compartmentalization. Compartmentalization as I understand it is when you have two contradictory pieces of information about the world and you keep them separate for whatever reason. I’m talking about two different skills. My actual beliefs stay roughly constant no matter what ontology/language I use to express them. Think of it like Solomonoff induction: the universal machine you choose only changes things by at most a constant. (Admittedly, for humans that constant can be the difference between seeing or not seeing a one-step implication, but such matters are tricky and would need their own post. Imagine if I were to try to learn category theory in Russian using a Russian-English dictionary.) And anyway, I don’t actually think in terms of theism except when I want to troll people, understand philosophers, or play around in others’ ontologies for kicks.
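(For anyone who wants the "at most a constant" claim made precise: this is the invariance theorem for Kolmogorov complexity. For any two universal machines $U$ and $V$, there is a constant $c_{UV}$, depending only on the pair of machines and not on the string being described, such that

$$K_U(x) \le K_V(x) + c_{UV} \quad \text{for all strings } x,$$

since $U$ can simulate $V$ given a fixed-length translation program. The analogy: switching ontologies/languages is like switching reference machines—description lengths shift by a bounded amount, but the induced beliefs agree in the limit.)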
I am not yet convinced that it isn’t misplaced, but I do thank you for your concern.