The advice, the tone, the vibe: they all ‘feel’ wrong, somehow. If you forced me to use more precise language, I might say that, for several years now, I have kept a variety of procedural heuristics running in the background that help me ferret out bullshit, partisanship, wishful thinking, and other unsound debating tactics—and important content on this website manages to trigger most of them.
To come up with a theory on the fly, maybe there are two modes of expansion for a group: by providing some service, and by sheer memetic virulence. One memetic virulence strategy operates by making outlandish promises that subscribing to it will make you smarter, richer, more successful, more attractive to the opposite sex, and just plain superior to other people—and then doing it in a way that can’t obviously be proven wrong. This strategy usually involves people with loads of positive affect going around telling people how great their group is and how they need to join.
As a memetic defense strategy, people learn to identify this kind of spread and to shun groups that display its features. From the inside, this strategy manifests as a creepy feeling.
LW members have lots of positive affect around LW and express it all the time; the group seems to be growing without providing any services (e.g., no one’s worried when a drama club grows, because drama clubs go and put on dramas, but it’s not clear what our group is doing besides talking about how great we are); and we make the outrageously bold claims about getting smarter and richer and sexier that a virulent meme trying to propagate would have to make.
I don’t think this creepiness detector operates on the conscious level, any more than the chance-of-catching-a-disease detector that tells us people with open sores all over their body are disgusting operates on a conscious level. We don’t stop considering the open sores disgusting if we learn they’re not really contagious, and we don’t stop considering overblown self-improvement claims from an actively recruiting community to be a particularly virulent memetic vector even if we don’t consciously believe that’s what’s going on.
(I’m still agnostic on that point. I’m sure no one intended this to be a meme optimized for self-propagation through outlandish promises, but it’s hard to tell if it’s started mutating that way.)
we make outrageously bold claims about getting smarter and richer and sexier
I’d like to know where all this LW-boasting is going on. I don’t think I hear it at the meetups in Mountain View, but maybe I’ve been missing something.
Darnit, I don’t like being vague and I also don’t like pointing to specific people and saying “YOU! YOU SOUND CULTISH!” so I’m going to have a hard time answering this question in a satisfying way, but...
Lots of people are looking into things like nootropics/intelligence amplification, entrepreneurship, and pick-up artistry. And this is great. What gives me the creepy vibe is when they say (more on the site than at meetups) “And of course, we’ll succeed at these much faster than other people have, even though they’re professionals in the field, because we’re Rationalists and they aren’t.” The same goes for anything involving the words “winning” or “awesomeness”, or gratuitous use of community identification terms like “primate” or “utility”.
Trying to look for examples, I notice it is a smaller proportion of things than I originally thought and I’m probably biased toward overcounting them, which makes sense since in order to justify my belonging to a slightly creepy group I need to exaggerate my opposition to the group’s creepiness.
Nonetheless, perhaps we need to adopt a new anti-cultishness norm against boasting about the predicted success of rationalists; or against ascribing personal victories to one’s rationality without having actually done the math to demonstrate the correlation between success and rationality. The cult attractor is pretty damn bad, after all, and ending up in it could easily destroy one hell of a lot of value.
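To make “done the math” concrete: the minimum bar would be computing an actual correlation between some rationality measure and some outcome measure across many people, rather than citing one’s own anecdotal wins. A toy sketch of what that floor looks like (every number and variable name here is invented purely for illustration):

```python
import statistics

# Hypothetical, invented data: per-person "rationality" scores
# (say, from a calibration test) and an outcome measure (say, income rank).
rationality = [3.1, 4.7, 2.2, 5.0, 3.8, 4.1, 2.9, 4.5]
outcome     = [2.0, 4.1, 2.5, 4.8, 3.0, 4.4, 2.7, 3.9]

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(rationality, outcome)
print(f"r = {r:.2f}")
```

Even a high r on real data would not settle causation, of course; the point is only that a single person’s success story gives you no r at all.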
One memetic virulence strategy operates by making outlandish promises that subscribing to it will make you smarter, richer, more successful, more attractive to the opposite sex, and just plain superior to other people—and then doing it in a way that can’t obviously be proven wrong.
That similarity is the key to both the perceived creepiness factor and the signal:noise ratio on this site. Groups formed to provide a service have performance standards that their members must achieve and maintain: drama clubs and sports teams have tryouts, jobs have interviews, schools have GPA requirements, etc. By contrast, groups serving as vehicles for contagious memes avoid standards. Every believer, even if personally useless to the stated aims of the group, is a potential transmission vector.
I see two reasons to care which of those classes of groups LW more closely resembles: first, to be aware of how we’re coming across to others; and second, as a measure of whether anything is actually being accomplished here.
Personally, I try to avoid packaging LW’s community and content into an indivisible bundle. From Resist the Happy Death Spiral:
To summarize, you do avoid a Happy Death Spiral by (1) splitting the Great Idea into parts (2) treating every additional detail as burdensome (3) thinking about the specifics of the causal chain instead of the good or bad feelings (4) not rehearsing evidence (5) not adding happiness from claims that “you can’t prove are wrong”; but not by (6) refusing to admire anything too much (7) conducting a biased search for negative points until you feel unhappy again (8) forcibly shoving an idea into a safe box.
There are a great many insightful posts on LW, mostly from Eliezer, Yvain, and a few others. There are other posts that are less specific and of correspondingly smaller insight. There is also a community centered in the discussion section that spends most of its time espousing the beliefs in the main post. Rather than allowing all these ideas to prop each other up, I’m content to wield the supported and useful techniques and discard the rest.
perhaps we need to adopt a new anti-cultishness norm against boasting about the predicted success of rationalists
This is a great idea!
A reminder that it wasn’t always like this.