Holden expects us to have epistemic and instrumental powers of rationality that would make us successful in Western society; however, this is a straw man. Being rational isn’t about succeeding in society, but about succeeding at your own goals.
But I didn’t think that what Holden got wrong was a confusion between one’s own goals and “success in Western society” goals. Many of SI’s own goals include “success in Western society” goals like lots of accumulated wealth and power. Instead, what I thought Holden got wrong was his estimate of the relation between rationality and success.
Re: the testing. LWers hadn’t trained specifically for the battery of tests given them that day, but they outperformed every other group I know of that has taken those tests. I agree that these data aren’t as useful as the data CFAR is collecting now about the impact of rationality training on measures of life success, but they are suggestive enough to support a weak, qualified claim like the one I made, that “it seems” like LWers are more rational than the general population.
It occurs to me that Holden’s actual reasoning (never mind what he said) is perhaps not about rationality per se and instead may be along these lines: “Since SI staff haven’t already accumulated wealth and power, they probably suffer from something like insufficient work ethic or high akrasia or not-having-inherited-billions, and thus will probably be ineffective at achieving the kind of extremely-ambitious goals they have set for themselves.”
It may or may not be Holden’s, but I think you’ve put your finger on my real reasons for not wanting to donate to SI. I’d be interested to hear any counterpoint.
But I didn’t think that what Holden got wrong was a confusion between one’s own goals and “success in Western society” goals. Many of SI’s own goals include “success in Western society” goals like lots of accumulated wealth and power. Instead, what I thought Holden got wrong was his estimate of the relation between rationality and success.
Right, then I (correctly, I think) took your reasoning a step farther than you did. SI’s goals don’t necessarily correspond with its members’ goals. SIers may be there because they want to be around a lot of cool people, and may not have any particular desire to be successful (though I suspect many of them do). But this discounts luck, like the luck of being born conscientious, which is the power to accomplish your goals. And like I said, an appeal to bad luck like that is unconvincing when applied to a whole group of people.
that “it seems” like LWers are more rational than the general population.
When I, an unknown here, say “it seems”, people will likely take me to be reporting an anecdote. When you, the executive director of SI and a researcher on this topic, say “it seems”, I think people will take it as a weak impression of the available research. Scientists adept at communicating with journalists get around this by saying “I speculate” instead.