“You cannot rely on anyone else to argue you out of your mistakes; you cannot rely on anyone else to save you; you and only you are obligated to find the flaws in your positions”
It wasn’t much of an “aha!” moment. When I first read it, I thought something along the lines of “Of course higher standards are possible, but if no one can find flaws in your argument, you’re doing pretty well.” The more I thought about it, though, the more I realized that EY had a good point. I later stumbled upon flaws in my long-standing arguments that I had overlooked, yet no one had ever called me on them.
Not only was the standard lower than I had previously realized, but it was entirely possible for someone to 1) not believe you, 2) not be able to put their refutation into words, and 3) still be right.
http://www.overcomingbias.com/2008/09/refutation-prod.html
I never had a sharp transition to rationality. I have been an “aspiring rationalist” for as long as I can remember. Though there were a few significant events, it was mostly just a gradual improvement.
Now that I think of it, my upbringing seems almost ideal for creating a rationalist. My dad is probably the most rational person I know, and although my mom is normally very rational, she would occasionally get upset about something and be extremely irrational. Not only was I raised by atypically rational people, but I also had practice dealing with irrationality. The fact that the only irrationality in my genes is intermittent (when emotional) and mild may have even acted as a “vaccine”.
One of the driving forces for me to actively try to be rational (as opposed to just not letting myself be knowably stupid) was that I enjoyed being contrarian on issues where people cared and were wrong. It was enjoyable to find things that people got wound up about, think about them rationally, and see what crazy-sounding ideas came up (stuff like Robin’s proposals for fixing health care).
Another driving force was that I hate to lose (be wrong), so I made sure to express uncertainty when I wasn’t certain, and changed my mind when necessary to stay on the “winning team”. It was okay to be 75% sure of something and change your mind (hell, it should happen one time in four), but when one claims p ~= 1, being wrong is an obvious failure of rationality. This helped me avoid wild overconfidence at the extremes of the probability scale.