I’m pretty sure I’m one of these unusual people. When I first read the litanies, I understood why they might be useful to some people (I have a lot of experience with religious fanatics), but I truly did not understand why they would be so important to Eliezer or other rationalists. I always figured they were meant to be a simple teaching tool, to help get across critical concepts and then to be discarded.
Gradually I came to realize that a large percentage of the community uses the various litanies on a regular basis. This still confuses me in some cases—for example, it would never even occur to me that evidence/data could simply be ignored or that any rationalization could ever trump it.
I suspect this inability to simply ignore inconvenient data is the reason for my low rate of rationalization. I do actually catch myself beginning to rationalize from time to time, but there’s always the undercurrent of “wishful thinking isn’t real”. No matter how hard I rationalize, I cannot make the evidence go away, so the rationalization process gives up quickly.
I have been like this for most of my life, and have memories of the “wishful thinking isn’t real” effect going back as far as my earliest childhood daydreaming and complex storytelling.
This seems wrong: rationalizing is what you do to inconvenient data instead of ignoring it.
Speaking for myself, I think that rationalizing does typically (always?) involve ignoring something. Not ignoring the first piece of inconvenient data, necessarily, but the horrible inelegance of my ad-hoc auxiliary hypotheses, or such.