Thanks for this summary! This is very important for the growth of the community.
I was thinking about whether being “too nerdy, weird, or socially awkward” is a bug or a feature, but it seems to me that we need to be more specific and look into the details. Some things in our community are inherently weird (unusual in everyday discourse); debating artificial intelligence, for example. But some forms of social awkwardness (harassment, boredom, unproductive debates) can—and should—be fixed; I mean, not just for PR purposes, but because that too is part of “becoming stronger”. Let’s see how far towards pleasant interaction we can go without sacrificing other values (such as honesty). I guess we can—and should—improve here a lot.
Maybe it’s an issue of going meta while solving the wrong problem. If I want to have a group of people who talk about artificial intelligence, I must focus not only on the “artificial intelligence” part, but also on the “having a group of people” part. This is probably our blind spot, because the former feels like an academic subject, while the latter feels almost like the opposite of academia (so we are even tempted to countersignal our sophistication by being bad at it). People can get a Nobel Prize for being educated, but nobody gets a prize for creating an environment where such people are happy to meet, debate, learn, and discover. All winning comes from people, and yet supporting other people in their winning is somehow low-status (as in: you are unable to win on your own, therefore the best you can do is to support others). Please note that this is specifically an academic bias—in business, you can gain a lot of money and status by creating things that other people need.
When we try to build a community, “building the community” is itself a topic to focus on. Yes, it can feel like making a community for the sake of making a community, which would be a lost purpose. But some things are true for communities in general, because they are true for humans in general, and if we want to have a good community, we have to study that. Also, not everyone has to focus on this, but someone should—preferably more than one person, so they can talk and share ideas. If you want to have a meetup debating artificial intelligence (or whatever else), create a subgroup that focuses on the topic, and a subgroup that focuses on the community. Both are necessary.
Bringing a box of cookies to the AI debate meetup could be more important than bringing an article about the latest discovery in AI. (And bringing an article about the latest discovery in AI is still preferable to just talking without really learning.) No, we don’t want to get to the point where everyone brings cookies and no one debates LW topics—but I suspect that even this strawman example is closer to a healthy and productive community than where many of us are now.
We need to apply our rationality, and to apply it specifically to creating rationalist communities. Yes, it is difficult. That shouldn’t be a reason to avoid it, but a reason to focus on it. It is a problem to be solved. And it will not be solved by anyone other than us.
Let’s see how far towards pleasant interaction we can go without sacrificing other values (such as honesty).
I rather suspect—and this is me talking, not my interpretation of the survey data—that this already concedes too much. I’ve talked to LWers who appeared to be hung up on honesty to the point of kneecapping themselves socially: not just preferring a more explicit interaction style, but outright refusing to deal with people who partake in perfectly normal social untruths. These extremes don’t seem to be common, but insofar as they’re a problem in some segments of the community, they won’t be solved without at least a few concessions against existing values.
Properly exploring this would probably take a top-level post, but I think I can summarize by saying I agree with ChrisHallquist here.