(ii) Adopt the stance of rebel: There is nothing that plays worse in our culture than seeming to be the stodgy defender of old ideas, no matter how true those ideas may be. Luckily, at this point the orthodoxy of the academic economists is very much a minority position among intellectuals in general; one can seem to be a courageous maverick, boldly challenging the powers that be, by reciting the contents of a standard textbook. It has worked for me!
You can shock many people by doing some rational things—those preselected for not being done by most people already, and also those that are explicitly counter to important irrational things that many people do. And these specific rational actions benefit from an availability bias. Conversely, once something is “normal”, it’s not a highly available mental example of “especially rational”.
But can you really shock many people by doing a randomly selected rational thing? By giving the right answer on a test? By choosing the deal that gains you the most money? By choosing a profession, a friend, a place to live, based on expectations of happiness? By choosing medical treatment based on scientific evidence? By doing something because it’s fun?
It might shock people that the choice is in fact rational; they may disagree that the deal you chose will earn you the most money. But when people agree about predictions, why would they be shocked by most rational choices? I think a random (but doable) irrational act is much more shocking than a random rational one.
You are correct, but I just want to point out that the original quote talks about distinguishing yourself, not shocking people. And I think most of what you said still applies.
Sometimes, yes, but only along certain dimensions. If your group’s rituals were rational, they would be the same as every other group’s. For example, the Jewish practice of eating flat bread on Passover is arbitrary [1], but it only works because it is arbitrary.
[1] It’s not entirely arbitrary if you believe the story of Passover, but that’s a somewhat different point. Actually, it may be interesting to examine whether it’s rational in that case—I can see arguments for both sides.
Interestingly, group rituals purely for the sake of group bonding needn’t be irrational. It’s irrational to believe that God is going to punish you if you eat leavened bread during Passover—I am caricaturing Jewish theology here, but the general point is sound—but it can be useful to set a test for group membership, or an action that marks you as part of a group, to help group cohesion. This is particularly useful if you’re up against other groups that would like to exploit you and you need as much help as possible to stay together so your group can put up a united front. Arbitrary dietary restrictions seem like a decent way to do that.
Not that anyone actually sat down and thought it out like this before deciding that Jews should abstain from leavened bread for a week every spring, or that Mormons shouldn’t drink alcohol, and so on. But I think there’s value in having an arbitrary ritual explicitly for the sake of group cohesion.
Sure, but that’s a lot more difficult. There are so many arbitrary things to do, and wrong things to believe, that they’re going to be the default because they’re easy.
I’m shocked, because I expect Yudkowsky to be rational and deep.
The original quote should be read in context, where it’s almost a tautology. What shocks me is the immediate “False.”
Sure, within the definitions here, it’s false. Within the context, though, I don’t want to say it’s “true”—I don’t believe in true/false as absolutes—but it is not false. It’s only when the statement is abstracted into a fully general claim, as it is here, that it takes on an obviously false character, because, of course, a group may be “distinguished” by “doing rational things” that differ from expectations.
Yudkowsky then goes on to recognize “grain of truth.”
The statement wasn’t made about just any group, though. It was made about what might be called “sects.” Graham considers truth to be common property. You can’t distinguish a sect, in the conversation Graham is creating, by what’s true about it, i.e., what is common among all, or among all rational thinkers. Sects—predefined group affiliations—are distinguished by the characteristic lies they tell. Or at least the characteristic stories, i.e., beliefs that are not falsifiable.
The original should be read. There is plenty to distinguish in it as to logical errors, but this statement was insightful.
False.
I mean, grain of truth, yes, literally true, no. You can shock the hell out of people and distinguish yourselves quite well by doing rational things.
Paul Krugman says something similar.
(Very close to the end of “Ricardo’s Difficult Idea”)
Well, it is similar insofar as “reciting the contents of a standard textbook” and “doing rational things” are similar.
Mileage varies.
Krugman’s talking about Ricardo’s Law in particular, very basic, very old, not disputed so far as I know, and not known to the general populace.