One reduction of “compartmentalization” is “failure to notice, or act on, inconsistent beliefs and/or desires”.
For example, in G.E.B., Douglas Hofstadter describes finding himself bored while driving, so he reached over to turn on his broken radio. Another time, he and his wife, traveling in Paris, decided they wanted a hotel room on the opposite side of the building from the American embassy because they were concerned about terrorism; yet when asked if they wanted a room with a better view of the embassy gardens, they accepted it, and only later realized what they had done.
Hofstadter goes on to state that the question “Do my beliefs imply a contradiction?” is NP-hard even in the restricted case in which all your beliefs can be expressed as statements in propositional logic; as you add more beliefs to a system, the number of ways to potentially deduce a contradiction grows exponentially. Therefore, some form of compartmentalization is mathematically inevitable, because nothing can do the 2^zillion computations necessary to check all the beliefs in a human’s head for consistency.
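To make the blowup concrete, here's a minimal sketch in Python (the names and the toy beliefs are mine, not Hofstadter's): checking consistency by brute force means trying every truth assignment, and the number of assignments doubles with each new propositional variable.

```python
from itertools import product

def consistent(beliefs, variables):
    """Return True if some truth assignment satisfies every belief.

    Each belief is a function from an assignment dict to a bool.
    Brute force: tries all 2**len(variables) assignments, which is
    exactly the exponential cost the comment above is pointing at.
    """
    for values in product([True, False], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if all(belief(assignment) for belief in beliefs):
            return True   # found a model: no contradiction
    return False          # every assignment falsifies some belief

# Three beliefs that are jointly contradictory:
beliefs = [
    lambda a: a["rains"],                  # "it rains"
    lambda a: not a["rains"] or a["wet"],  # "if it rains, the ground is wet"
    lambda a: not a["wet"],                # "the ground is not wet"
]
print(consistent(beliefs, ["rains", "wet"]))       # False
print(consistent(beliefs[:2], ["rains", "wet"]))   # True
```

With two variables that's 4 assignments; with 300 beliefs over a few hundred variables it's the "2^zillion" above, which is why nobody expects a human mind to run this check.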
Of course you get to choose which beliefs go into your head to begin with. You can avoid contradictions by only putting in beliefs that are logically implied by ones you already have, for example.
What? Were you ever a child?
If you only accept beliefs that are implied by your existing ones, you’ll never believe anything new. And as such, you’ll stop updating your beliefs.
Not necessarily. If you slowly develop towards logical omniscience, you’ll only accept beliefs implied by your existing ones, but you will still come to believe some new things. You’ll be updating your beliefs on new implications of your current beliefs rather than on evidence, sure, but that’s not such a weird concept; it ran strongly through the schools of analytic philosophy for a long time.
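The acceptance rule being described can be sketched in code (my own toy formulation, not anything from the thread): a candidate belief is accepted only if it is entailed by the current beliefs, i.e. if every truth assignment that satisfies the current beliefs also satisfies the candidate.

```python
from itertools import product

def entails(beliefs, candidate, variables):
    """Return True if every assignment satisfying `beliefs`
    also satisfies `candidate` (brute-force entailment check)."""
    for values in product([True, False], repeat=len(variables)):
        a = dict(zip(variables, values))
        if all(b(a) for b in beliefs) and not candidate(a):
            return False  # counterexample: beliefs hold, candidate fails
    return True

beliefs = [
    lambda a: a["rains"],                  # "it rains"
    lambda a: not a["rains"] or a["wet"],  # "if it rains, the ground is wet"
]
# "the ground is wet" follows, so it gets accepted as a *new* belief:
print(entails(beliefs, lambda a: a["wet"], ["rains", "wet"]))          # True
# "it is cold" does not follow, so it gets rejected:
print(entails(beliefs, lambda a: a["cold"], ["rains", "wet", "cold"]))  # False
```

Note that "the ground is wet" was not in the belief set before the check, which is the point: accepting only implied beliefs still yields genuinely new beliefs, just ones derived from implications rather than from evidence.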
Right, which is why I said “for example”. My point is simply that there are many fewer contradictions between our beliefs than CronoDAS’s comment would suggest, since our beliefs are somewhat formed by processes that make them coherent.