I’ve heard a story about a cat:
The cat sat sunning itself by the window for several hours. Then it got up and walked off. My roommate said “That’s how we can tell a cat has complex inner life—apparently uncaused but decisive action.”
Surely decisive action has more possible causes than social groups?
The moral here seems to be that humans already share a common internal mechanism for overturning belief systems that has nothing to do with rationality. Outlining it completely would take me more research and probably a top-level post's worth of space (as far as I can tell, it hasn't been addressed directly), but it's related to the cult attractor, and you can see hints of how it works in conditions like Stockholm syndrome.
Since this mechanism is far more common across the set of all humans than accurate intuitive access to a truly rational procedure for deciding between elements of belief systems, it makes sense to treat it as the more likely cause of your own decisions, absent evidence that you actually used such a procedure.
It is sudden large belief changes that are suspicious, not decisive acts.
How about the case where one does not hold any single belief to be clearly truer than another, but instead holds to one consistent set of beliefs on which one's view of the universe and one's morality are based? Such a set cannot be changed gradually, because that would lead to inconsistency. One then accumulates enough evidence against the individual beliefs of one set, and in favour of those of another, to decide to change sets entirely.
Religion-coded worldviews, and other similarly structured ones, are not buffets: you either take the whole menu or nothing at all. You can't divide them into pieces the way you could treat, for example, Marxism. It's all or nothing.