I notice this isn’t showing up on the sidebar of SSC; if you want it to, consider tagging this as SSC here.
I support the opposite perspective—it was wrong to ever focus on individual winning and we should drop the slogan.
“Rationalists should win” was originally a suggestion for how to think about decision theory; if one agent predictably ends up with more utility than another, its choice is more “rational”.
But this got caught up in excitement around “instrumental rationality”—the idea that the “epistemic rationality” skill of figuring out what is true is only the handmaiden to a much more exciting skill of succeeding in the world. The community redirected itself toward figuring out how to succeed in the world, i.e. became a self-help group.
I understand the logic. If you are good at knowing what is true, then you can be good at knowing what is true about the best thing to do in a certain situation, which means you can be more successful than other people. I can’t deny this makes sense. I can just point out that it doesn’t resemble reality. Donald Trump continues to be more successful than every cognitive scientist and psychologist in the world combined, and this sort of thing seems to happen so consistently that I can no longer dismiss it as a fluke. I think it’s possible (and important) to analyze this phenomenon and see what’s going on. But the point is that this will involve analyzing a phenomenon—i.e. truth-seeking, i.e. epistemic rationality, i.e. the thing we’re good at and which is our comparative advantage—and not winning immediately.
Remember the history of medicine, which started with wise women unreflectingly using traditional herbs to cure conditions. Some very smart people like Hippocrates came up with reasonable proposals for better ideas, and it turned out they were much worse than the wise women. After a lot of foundational work they eventually became better than the wise women, but it took two thousand years, and a lot of people died in the meantime. I’m not sure you can short-circuit the “spend two thousand years flailing around and being terrible” step. This community doesn’t seem to have managed it.
And I’m worried about the effects of trying. People in the community are pushing a thousand different kinds of woo now, in exactly the way “Schools Proliferating Without Evidence” condemned. This is not the fault of their individual irrationality. My guess is that pushing woo is an almost inevitable consequence of taking self-help seriously. There are lots of things that sound like they should work, and that probably work for certain individual people, and it’s almost impossible to get the funding or rigor or sample size that you would need to study it at any reasonable level. I know a bunch of people who say that learning about chakras has done really interesting and beneficial things for them. I don’t want to say with certainty that they aren’t right—some of the chakras have a suspicious correspondence to certain glands or bundles of nerves in the body, and for all I know maybe it’s just a very strange way of understanding and visualizing those nerves’ behavior. But there’s a big difference between me saying “for all I know maybe...” and a community where people are going around saying “do chakras! they really work!” But if you want to be a self-help community, you don’t have a lot of other options.
I think my complaint is: once you become a self-help community, you start developing the sorts of epistemic norms that help you be a self-help community, and you start attracting the sort of people who are attracted to self-help communities. And then, if ten years later, someone says “Hey, are we sure we shouldn’t go back to being pure truth-seekers?”, it’s going to be a very different community that discusses the answer to that question.
We were doing very well before, and could continue to do very well, as a community about epistemic truth-seeking mixed with a little practical strategy. All of the great ideas like effective altruism or friendly AI that the community has contributed to are things that people got by thinking, by trying to understand the world and avoid bias. I don’t think the rationalist community’s contribution to EA has been the production of unusually effective people to man its organizations (EA should focus on “winning” to be more effective, but no more so than any other movement or corporation, and it should go about it in the same way). I think rationality’s contribution has been helping carve out the philosophy and convince people that it was true, after which those people manned its organizations at the usual level of effectiveness. Maybe rationality also helped develop a practical path forward for those organizations, which is fine, and a more limited and relevant domain than “self-help”.
I’m a little confused. The explanation you give would explain why people might punish pro-social punishers, but it doesn’t really give insight into why they would punish cooperators. Is the argument that cooperators are likely to also be pro-social punishers? Or am I misunderstanding the structure of the game?
I agree Evan’s intentions are good, and I’m glad that someone interesting who wants to criticize my writing is getting a chance to speak. I’m surprised this has been downvoted as much as it has, and I haven’t downvoted it myself.
My main concern is with the hyperbolic way this was pitched and the name of the meetup, which I understand were intended kind of as jokes but which sound kind of creepy to me when I am the person being joked about. I don’t think Evan needs to change these if he doesn’t want to, but I do just want to register the concern.
I think it’s good and important to criticize things, and I don’t consider myself above criticism.
On the other hand, it’s also kind of freaking me out to hear that a bunch of people in a city I’ve been in for like an hour tops are organizing an event using a derisive nickname for me and calling me a pseudointellectual, especially since I just sort of stumbled across it by coincidence.
I’m not sure how to balance these different considerations, and probably my feelings aren’t as important as moving the engine of intellectual progress, but for the record I’m not really happy with the attempt made to balance them here.
I don’t know if I am supposed to defend myself, but I will just say that I am particularly tired of criticism of the Dark Ages post. My experience of this has been a bunch of Redditors talking about how a freshman history student would have been ashamed to make so many howling mistakes, and then a bunch of trained historians telling me they thought it was basically fine (for example, here’s a professional medieval historian saying he agrees with it entirely, here’s a Renaissance historian who thinks it’s fine, here’s a historian of early modern science who says the same—also, I got an email from a Dominican friar who liked it, which is especially neat because it’s like my post on the Middle Ages getting approval from the Middle Ages). I’m not saying this to make an argument from authority, I’m saying it because the people who disagree with me keep trying to make an argument from authority, and I don’t want people to fall for it.
And, okay, one more thing. My Piketty review begins: “I am not an economist. Many people who are economists have reviewed this book already. I review it only because if I had to slog through reading this thing I at least want to get a blog post out of it. If anything in my review contradicts that of real economists, trust them instead of me.” If you’re using errors in it to call me a pseudointellectual, I feel like you’re just being a jerk at this point. Commenters did find several ways I was insufficiently critical of Piketty’s claims, which I describe here; I also added a correction note to that effect to the original post. The post was nevertheless recommended by an economist who said it was “the best summary I’ve ever read from a non-economist”. Again, I’m not saying this as an argument from authority; I’m saying it because I know from experience that the criticism is going to involve a claim that “it’s so bad that no knowledgeable person would ever take it seriously”, and now you’ll know that’s not true.
I’ve increased my monthly donation to $600. Thanks again to Sarah and everyone else who works on this.
The 5 AM time looks like a mistake. David told me it’s at 2 PM.