How rational was your transition to rationality? A sudden transition seems more suspicious, as that looks a lot like the sudden transitions humans tend to make between social groups. After all, there is usually little social benefit to sitting between social groups; social rewards come more to those firmly within one group or another. A gradual transition, on the other hand, seems more plausibly to match the more steady rate at which relevant info arrives on such topics. How much more relevant info could you really have obtained via one story or essay? Whatever your conscious thoughts, if you had a sudden transition I’m guessing that was your subconscious mind thinking something like “Yes, this looks like a good social group to join.”
I feel that perhaps you are being too cynical. There’s such a thing as an insight snapping into place and recoding a lot of old information.
And there’s such a thing as force building up for a long time against resistance, and then the resistance breaking; this is not sane, per se, but it’s how I would describe my own sharp transition in 2003. I certainly don’t think you could describe that as joining a social group.
Actually, I’d think there would be a lot of sources for sharp mental transitions. Just having to make a local choice between A and B will generate a sharp transition whenever the preference A < B flips to A > B, and then other things have to follow.
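The preference-flip point above can be sketched concretely (a toy illustration of my own, not anything from the comment): the underlying evaluations drift smoothly, but the resulting choice switches all at once.

```python
# Toy sketch: the inputs to a local A-vs-B choice change gradually,
# yet the choice itself flips in a single step.
def preferred(value_a, value_b):
    """Pick whichever option currently scores higher (ties go to B)."""
    return "A" if value_a > value_b else "B"

# value_a stays fixed at 5 while value_b rises one unit per step; the
# numbers change smoothly, but the moment the comparison reverses, the
# chosen option switches discontinuously and everything downstream of
# that choice has to follow.
choices = [preferred(5, step) for step in range(11)]
# first five steps pick "A", then the preference flips and stays "B"
```

The discontinuity lives in the argmax, not in the evidence: a choice function over continuously varying values is itself a step function.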
I agree with Eliezer here, but Robin also has a point. I think we should distinguish between the transition away from one position and the transition towards another. Because falsification is easier than confirmation, once the right evidence falls into place, a rationalist should expect to quickly abandon prior beliefs. The problem arises if something else quickly fills the void without being thoroughly tested. I saw a couple of high school friends fall into the trap of thinking the opposite of stupidity is intelligence after leaving religion behind.
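The falsification/confirmation asymmetry above can be made concrete with a toy Bayesian sketch (my own illustrative numbers, not anything from the comment): weak confirming evidence accumulates slowly, while a single near-decisive disconfirmation can collapse a belief in one step, which is why rapid abandonment is less suspicious than rapid adoption.

```python
# Toy Bayesian sketch with made-up likelihood ratios: many weak
# confirmations move a belief slowly, but one strong falsifying
# observation collapses it almost instantly.
def update(prior, likelihood_ratio):
    """Multiply the prior odds by the likelihood ratio; return posterior."""
    odds = prior / (1 - prior) * likelihood_ratio
    return odds / (1 + odds)

p = 0.5
for _ in range(10):              # ten weak confirmations, LR = 1.2 each
    p = update(p, 1.2)
after_confirmations = p          # creeps up gradually, to about 0.86

p = update(p, 1e-4)              # one near-decisive falsification
after_falsification = p          # collapses to roughly 0.0006 at once
```

The asymmetry is just in the available likelihood ratios: disconfirming observations (e.g. a clean counterexample) often carry extreme ratios, while confirmations rarely do.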
Beware a slow transition away from old beliefs as much as a sharp transition to new ones.
Yes, joining social groups isn’t the only possible cause of sudden belief changes, but since the relevant info should have been coming out pretty gradually, it is still hard to see how a sudden large belief change could be that rational. I suppose one could suddenly see an implication of evidence one had long held, but then the suddenness should be attributed to realizing that some point of view was possible at all. A sudden move to a point of view one had already recognized as possible would be harder to describe as rational.
[I also mean this comment to reply to other comments besides Eliezer’s but this system offers no easy way to express that.]
If the belief change we’re talking about is becoming more rational, then the implication is that you’ve been irrational up until that point and failing to integrate evidence.
Saying “I’ve been such an idiot!” is a further factor discriminating in this direction.
That’s what happened in my transition.
I don’t know, man. For me, joining the “rationalist” tribe only means trouble, socially speaking. It is not a tight or rewarding community, it lacks solidarity and structure, and we can’t seem to agree on stuff. However, I would be fooling myself by denying that being immersed in a European environment that actively encouraged me not only to ignore the edicts of my native religion but to positively disbelieve in it may have had a role in making a position as a Muslim untenable. As for my change as a rationalist… in the Lesswrong meaning of the term… you could say I had always been a rationalist, ever since I was a small child. The turning point religiously was “Religion’s Claims Of Non-Disprovability”, which, like many articles here, managed to tie up many loose ends that had been worrying me for years and made the whole puzzle snap together. What Lesswrong did was help me organize my thoughts and show me the natural conclusions of the thought threads that I had been too intellectually horrified to follow on my own. It helped me become more myself, so to speak. So, no, mostly it wasn’t a “good social group to join” rationale, though that certainly was a deal-sweetener.
I’ve heard a story about a cat:
The cat sat sunning itself by the window for several hours. Then it got up and walked off. My roommate said “That’s how we can tell a cat has complex inner life—apparently uncaused but decisive action.”
Surely decisive action has more possible causes than social groups?
Seems the moral here is that humans already have a common internal mechanism for overturning belief systems that has nothing to do with rationality: outlining it completely would take me more research and probably enough space for a top-level post (it hasn’t been addressed directly as far as I can tell), but it’s related to the cult attractor and you can see suggestions of how it works in conditions like Stockholm syndrome.
Since this mechanism is a lot more common across humans than accurate intuitive access to a truly rational procedure for deciding between elements of belief systems, it makes sense to consider it the more likely cause of your own decisions, in the absence of evidence for such a decision procedure.
It is sudden large belief changes that are suspicious, not decisive acts.
How about when one does not hold one belief to be clearly truer than another, but holds a single consistent set of beliefs on which their view of the universe and their morality are based, and which cannot be changed gradually because that would lead to inconsistency? One then accumulates enough evidence against the individual beliefs of one set, and in favour of those of another, to decide to change sets entirely.
Religion-coded and other similar worldviews are not buffets: you either take the whole menu or nothing at all. You can’t divide them into bits the way you would treat, for example, Marxism. It’s all or nothing.
I can’t speak with authority on his mental state at the time, but I was a participant in the online conversation during which his transition occurred, and I would say that it seemed like a click moment. He had already been grappling with religious issues and attempting to reconcile them in a rational way, and had started to make headway on the sequences, and in the course of that conversation, he seemed to realize that there were simple and obvious reasons why the approach he had been taking didn’t make sense, that the mistakes were correctable, and that by making a generalized effort to recognize and avoid those sorts of mistakes, he could make his reasoning more correct.
He already had the relevant information, but the connection was a relatively sudden event.
Edit: wrote this after seeing Raw Power’s comment in the recent comments bar without noticing the date of the comment he was responding to. I interpreted it as a question to Raw Power.
Sudden transitions are suspicious, and caution should be applied when switching world views. Still, it does seem like you are jumping to conclusions in your guess: “Whatever your conscious thoughts, if you had a sudden transition I’m guessing that was your subconscious mind thinking something like ‘Yes, this looks like a good social group to join.’” This is a very narrow guess which you have not backed up with evidence. I also cannot find supporting evidence in my own observations or reading. While it is a possibility, it still seems likely to be an oversimplification.
Knowing that a previous belief system has been falsified (by some of its foundations being disproven or thrown into doubt), it is perfectly fine to tentatively adopt a new system of beliefs and experimentally verify its correctness.