I think the only real use of such a survey will be to more precisely pin down how members of this community are using the term “rationality.” And by the construction of the survey, it seems pretty clear that what they (or you) mean by “rational” is just “whatever we are.”
AlexU
“But it’s still an interesting point that Science manages to survive not because it is in our collective individual interest to see Science get done, but rather, because Science has fastened itself as a parasite onto the few forms of large organization that can exist in our world.”
Can’t it be both? It’s not as if the government’s support of science is completely unrelated to its being in our collective interest. It’s awfully cynical simply to presume, without comment, that governments don’t ultimately work for our collective interest (even if they can at times be very misguided).
More generally, this post seems full of lines like the one above that seem true at first glance but, upon closer inspection, prove banal, vacuous, or only questionably correct. How about some supporting evidence or arguments?
I’ve noticed that many of the meta-posts here are premised on the belief that being interested in rationality (or being interested in identifying oneself with a community of “rationalists”) is strongly correlated with actually being more rational. I wouldn’t be surprised if this were indeed the case, as compared to average; however, I’d be very hesitant to simply assume that people here are “atypically rational” as compared to, say, another Internet community comprising similarly high-IQ individuals, but with no special interest in rationality per se.
Good points. I share your concern. But it’s not clear which direction rationality cuts in this case. If I have no special attachment to the “me” of one year from now, why should I sacrifice present interests for his? On the other hand, I’ve been wondering recently if it’s possible to salvage our folk concept of identity by positing that, while “me” at T2 might not be “me” in any robust sense, (1) there will be a person (or locus of consciousness, if you will) at T2 who thinks he’s me, and shares many of my memories and behavioral predispositions, and (2) that person will be disproportionately influenced by my actions today. I think it follows from ethical considerations, then, if not prudential ones, that I should act today in a way that is in keeping with my best interests, so as not to unduly harm that future person.
Now, what would really be interesting would be if we discovered that the “rational” thing to do would be some averaging of the two extremes—i.e., I continue to act generally in my future best interests, but also prioritize present and near-term happiness to a much greater degree than seems naively appropriate.
Your postscript raises an interesting point. I strongly suspect that readers here can have a much greater impact on their real-world success by improving arational traits like charisma and physical appearance than by continuing to strive for marginal gains in what are likely to be already-high levels of rationality. At the very least, it seems uncontroversial to say that these traits play a huge role in one’s real-world success. If we assume, then, that “real-world success” is a rational objective, why isn’t everyone here hitting the gym daily, working to improve their fashion sense, and enrolling in acting classes to improve social finesse?
One obvious implication of this is that we should be making our homes in warmer climates. Even if you, personally, have high resistance to foul weather, it’s going to be tougher to get people to walk and converse with you year-round in Boston than it would be in Miami.
This conflicts with the observation that, at least in modern times, the colder parts of the world have tended to produce the better thinkers. I’m not sure it would be smart to move from Cambridge to South Beach in hopes of leading a more intellectually fruitful life...
Rationality has its limits. We all know that daily exercise is good for us, and that it’s something we should be doing. It’s pretty clearly the “rational” choice. But can rationality actually get us to exercise every day? Is there some further bias we can eliminate that will enable us to drag our asses to the gym even when we’re feeling completely exhausted? I doubt it—there’s just nothing much more that rationality can do for us in that department. A related (and rhetorical) question: are fat people fat because they’re rationally deficient in some sense? We need to be careful not to downplay the extremely powerful and seemingly ineradicable influences of emotion and subjective experience (urges, fatigue, impulses, etc.) in our day-to-day decision-making.
Yes. I’ve been a semi-regular reader of OCB for about a year. I think it’s an interesting blog. But have I learned anything useful from it? Has it made any practical difference in the choices I make, either day-to-day or long-term? The answer is no. Admittedly, this may be my own fault. But I recall a post, not too long ago, soliciting people’s feedback on “the most important thing you learned from OCB in the past year,” or something of that sort. And while there were lots of people excitedly posting about how much OCB has taught them, the examples they gave were along the lines of “I learned the power of fundamental attribution error!” or “I learned the importance of continually adjusting my priors!” with curiously few examples of real differences OCB made in anyone’s practical choices. This raises the question: if tweaking our rationality has no appreciable effect on anything, then how can we say we’re really tweaking our rationality at all? Perhaps we’re just swapping new explanations for fundamentally irrational processes that are far too buried and obscure to be accessible to us.
That said, I think things like the recent posts on akrasia are strong moves in the right direction. Intellectually interesting, but with easy-to-grasp real-world implications.
My point was just that knowing what to do and actually doing it are two separate things. It’s possible that someone could come to the objectively rational conclusion in every single circumstance, yet fail to act on those conclusions for a variety of other reasons. In that case, it would be very tough to say their rationality is in any way at fault.
“Rationalism” as compared to what? Mysticism? Random decision-making? Of course rational behavior is going to be by far the best choice for achieving one’s particular ends. I wasn’t questioning the entire concept of rationalism, which clearly has been the driving force behind human progress for all of history. I was questioning how much making “tweaks”—the kind discussed here and on OCB—can do for us. Or have done for us. Put differently, is perseverating on rationality per se worth my time? Can anyone show that paying special attention to rationality has measurable results, controlling for IQ?
This sounds an awful lot like one of the examples I gave above. Ok, so you’re focused on “risk reduction” and “reducing akrasia.” So what’s that mean? You’ve decided to buckle up, wear sunscreen, and not be so lazy? Can’t I get that from Reader’s Digest or my mom?
Yes, but these are things most reasonably intelligent people know, or figure out, anyway. It seems correct to chalk up these insights to rationality, but trivially so. I don’t see what extra work studying rationality per se would be doing for us here.
The latter.
Write shorter posts. Write in a simpler and less oracular prose style. And write more substantive posts—at times, it seems as if you believe your every passing thought deserves 2,000 words. I’ll often read your posts and, while recognizing some germ of a worthwhile idea there, regret the time and effort it took to locate it.
An easy way to see when your comments have been replied to, and to read those replies, would be great. Reddit has this feature. Right now I’m unaware of any way to do this on LW besides checking each of the individual parent posts.
I have trouble seeing why radical honesty should be seen as a virtue by default. It’s fairly clear that radical honesty doesn’t necessarily promote happiness. From a utilitarian perspective, then, it should be value-neutral.
I personally place a high value on having true beliefs. More than most people. However, I’m not sure I’d value true beliefs over my own happiness. If I were a devout Christian, for example, and derived a great amount of comfort from my faith, I’m not sure I’d want someone to convince me otherwise. Given that most people will value true beliefs even less than I do, I’d find it even harder to justify convincing others of God’s non-existence. That’s imposing my own value judgments upon others, often to the detriment of their happiness. Studies have shown that depressed people are more likely to have accurate beliefs than happy people. If there’s a causal connection between the two, what are we to make of radical honesty?
Similarly, one also finds that practicing radical honesty in the social sphere is unlikely to win one many friends, and will in fact piss a lot of people off. Little white lies are what grease many of our most important social interactions. What’s to be gained by a policy of radical honesty in that domain?
Radical honesty is a chief virtue in science and academia, of course; maybe the chief virtue. But to apply that norm to the world at large is to ignore basic facts of human psychology and social interaction.
Anyone care to explain why this comment (and for that matter, the one below) was downvoted? Given that my karma score just dropped about 10 points in under an hour, I can only assume someone is going through my history and downvoting me for some reason. Great use of the karma system.
All else being equal, shouldn’t rationalists, almost by definition, win? The only way this wouldn’t happen would be in a contest of pure chance, in which rationality could confer no advantage. It seems like we’re just talking semantics here.
Whatever it is you want to do with your life. I can’t think of many fields in which a rational outlook wouldn’t be of use. This goes back to fundamental values, interests, talents, etc. -- the dictates of rationalism can’t decide everything for you.
Yes. Rationalism shouldn’t be seen as a bag of discrete tricks, but rather, as the means for achieving any given end—what it takes to do something you want to do. The particulars will vary, of course, depending on the end in question, but the rational individual should do better at figuring them out.
On a side note, I’m not sure coming up with better slogans, catchphrases, and neologisms is the right thing to be aiming for.
None of these questions directly measure “rationality” in any real way. The most you’re going to glean from such data will be a demographic slice of the readership of LW/OCB. Now, I imagine this group will be somewhat more “rational” than average (if only due to higher “g,” and the presumed correlation between the two; I wouldn’t even be so quick to ascribe greater rationality to people professing an interest in the concept), but it’s patently silly to equate “being a reader of LW/OCB” with “possessing the quality of rationality.”