I can’t trace my present efforts at rationality back to one “Aha” moment; trying to do so feels akin to applying the Sorites paradox to subjective experience: lots of problems there. But, for what it’s worth, I remember certain events and thoughts I associate with “breakthroughs”—spans of time after which I became more eager to find, and more aware of, my own biases.
Here are a few that I remember:
Like many other people, confronting my religious beliefs was a milestone. I’d grown up Roman Catholic, and as a child Christian myth and metaphysics excited my imagination. As I encountered other belief systems I found interesting, I tended to engage in apologetics (aka feeding my confirmation bias). Through people I respected in the martial arts I was introduced to aspects of Buddhism and Taoism that seemed, to me, to have some truth to them. Maybe this is akin to what Robin Hanson describes: I wanted to bridge the gap between social groups that I liked. Internally, I began to adjust my religious beliefs to be looser, more “mystical”, less dogmatic, to accommodate the beliefs of other people. The big breakthrough happened while taking a survey course in Western literature that included readings in Judaic and Christian texts. Looking at these texts from a strictly literary perspective had a big effect on me. I panicked and read The Case for Christ, but in the end I concluded that a strictly literary perspective was the most valuable way to engage with “The Bible” if you’re genuinely searching for truth. In my head I saw a thousand exegetical scholars and apologists spread across history, all frantically waving their hands.
“How dangerous is self preserving belief,” I thought, staring down at the tracks, waiting for the downtown A train at 34th Street. “And how utterly comfortable.” I felt immensely alone in that moment, scared about having to confront the people I care about and their treasured beliefs, and say, “You’re wrong.”
An experience last year made it much more apparent to me how biases can just friggin screw things up. I had a friend and mentor I admired as one of the most a) intelligent and b) altruistic people I’d ever met. In short, what happened was she accused me of doing something bad to her that I did not do. I didn’t hear this from her directly: she just stopped talking to me, and I had to really bug our mutual friends to find out why. What she had accused me of doing was utterly ridiculous, but I understood that it would be nearly impossible to convince her otherwise. My friend is the kind of person who reaches negative conclusions about people only with immense consternation, something I used to think was a virtue. But once she had decided that I, an important and close person, had done something bad, no amount of discussion could convince her otherwise. She could muster the equivalent of a thousand apologists to defend her existing belief. (An example of intelligent people shooting themselves in the foot.) Aside from the fact that I had just lost a very dear and important friend, I was angry—so angry that someone so good and smart could make such a fatal error. We talk about cognitive biases in public policy and in global catastrophic risk, as an obstacle to human progress and knowledge. But here I experienced a very dramatic and personal example of irrationality’s consequences. Likely she’ll go on the rest of her life believing that a close friend of hers betrayed her. I do think that avoiding the destruction of the world, and preventing the purposeless deaths of all people, is a more important reason to study rationality. This was just an up-close reminder that the dangers of irrationality are here, now, and that devastating consequences do lie in wait. I wish I didn’t need such an experience, and I know I should be careful with how it influences my beliefs and actions in the future.
Robin Hanson’s point is especially relevant here when he asks if our transition to rationality was rational. This was a very emotional reaction to a bad occurrence. Yet it is what, at least initially, increased my desire to be, shall I say it, Less Wrong.