Hello.
I’ve been reading Less Wrong since its beginning. I stumbled upon Overcoming Bias just as LW was being launched. I’m a young mathematician (an analyst, to be more specific) currently working towards a PhD, and I’m very interested in epistemic rationality and the theory of altruistic instrumental rationality. I’ve been very impressed with the general quality of discussion about the theory and practice of truth-seeking here, even though I can think of places where I disagree with the ideas that I gather are widely accepted. The most interesting discussions seem to be quite old, though, so reviving them out of the blue hasn’t felt like—for lack of a better word—a proper thing to do.
There are many discussions here that I don’t care about. A large proportion of people here are programmers or otherwise from a CS background, and that colors the discussions a lot. Or maybe it’s just that the prospect of AGI in the near future doesn’t seem at all likely to me. Anyway, I just don’t care about the AI/singularity stuff, the tangentially related topics I bunch together with it, or approaching rationality topics from a programmer’s point of view. Not very much, at least.
The self-help stuff, the “winning is everything” stuff, and related material I’d rather not read. Well, I do my best not to. The apparent lack of concern for altruism in those discussions even makes me wish they wouldn’t take place here in the first place.
And then there are the true failings of this community. I had been thinking of registering and posting in some threads about the more abstract sides of rationality, but I must admit I eventually got around to registering and posting because of the gender threads. But there’s just so much bullshit going on! Evolutionary psychology is grossly misapplied (1). The obvious existence of oppressive cultural constructs (2) is flatly denied. The validity of anecdotes and speculation as evidence is hardly even questioned. The topics that started the flaming have no reason to even be here in the first place. This post pretty well sums up the failures of rationality here at Less Wrong; and that post has been upvoted to 25! Now, the failings and attitudes that surfaced in the gender debate have, of course, been visible for quite some time. But the apparent prevalence of these failures of thought has made me wonder whether this community as a whole is actually worth wasting my time on.
So, in case you’re still wondering, what has generously been termed “exclusionary speech” really drives people away (3). I’m still hoping that the professed rationality is enough to overcome the failure modes that are currently so common here (4). But unfortunately I think my possible contributions won’t be missed if I rid myself of wishful thinking and see it’s not going to happen.
It’s quite a shame that a community with such good original intentions is failing after a good start. Maybe humans simply can’t overcome their biases (5) quite yet, even in this day and age.
So. I’d really like to participate in thoughtful discussions with rationalists I can respect. For quite a long time, Less Wrong seemed like the place, but I just couldn’t find a proper place to start (I dislike introductions). But now, as I’m losing my respect for this community and thus the will to participate here, I’ve started posting. I hope I can regain my confidence in a high sanity waterline here.
(Now a proper rationalist would, in my position, naturally reconsider his own attitudes and beliefs. It may not be surprising that I didn’t find very much to correct. So I might just as well assume that I haven’t been mind-killed quite yet, and just make the post I wanted to.)
EDIT: In case you felt I was generalizing with too much confidence—and as I wrote here, I agree I was—see my reply to Vladimir Nesov’s reply.
(1) I think failing to control for cultural influences in evolutionary psychology should be considered at least as serious a failure as postulating group selection. Probably more so.
(2) Somehow I think phrases like “cultural construct”, especially when combined with qualifiers like “oppressive”, trigger immediate bullshit alarms for some. To a certain extent, it’s forgivable, as they certainly have been used in conjunction with some of the most well-known anti-epistemologies of our age. But remember: reversing stupidity doesn’t make you any better off.
(3) This might be a good place to remind the reader that [our kind can’t cooperate](http://lesswrong.com/lw/3h/why_our_kind_cant_cooperate/). (This is actually referring to many aspects of the recent debate, not just one.)
(4) Yes, I know, I can’t cooperate either.
(5) Overcoming Bias is quite an ironic name for that blog. EDIT: This refers exclusively to the many posts of Robin Hanson’s about gender differences that I have read. I think I saw a post linking to some of these recently, but I couldn’t find a link to it just now. Anyway, this footnote probably went a bit too far.
Upvoted for this in particular.
I appreciate your honest criticisms here, as someone who participated (probably too much) in the silly gender discussion threads.
I also encourage you to stay and participate, if possible. Despite some missteps, I think there’s a lot of potential in this community, and I’d hate to see us losing people who could contribute interesting material.
The evils of in-group bias are getting at me. I felt a bit of anger when reading this comment. Go figure: I rarely feel noticeable emotions, even in response to dramatic events. The only feature that could have triggered that reaction seems to be the comment’s dissenting theme, the way it breached the normal narrative of the game of sane/insane statements. I wrote a response after a short time-out; I hope it isn’t tainted by that unfortunate reaction.
I don’t think it’s in-group bias. If anything, people are giving mni extra latitude because he or she is seen as new here.
If an established member of the community were to make the same points (that much of the discussion is uninteresting or bullshit, and that the community is failing and maybe not worth “wasting” time on), and were likewise to claim to have interesting things to say while making excuses for not actually saying them, I bet there would be a lot more criticism in response.
As I wrote, anger is an improbable reaction for me, and there doesn’t seem to be anything extraordinarily angering about that comment, so I can’t justify that emotion appearing in this particular instance. The fact that the poster isn’t a regular might be a factor as well.
Interesting. You provide one counterexample to my opinion that the biased language wasn’t driving away readers. I now have reason to believe I might have been projecting too much.
Welcome. :)
One thing I hope you have noticed is that there are different subgroups of people within the community that like or dislike certain topics. Adding content that you prefer is a good way to see more growth in those topics.
mni, I followed in your footsteps years later, and then dropped away, just as you did. I came back after several months to look for an answer to a specific question—stayed for a bit, poking around—and before I go away again, I’d just like to say: if this’d been a community that was able to keep you, it probably would have kept me too.
You seem awesome. Where did you go? Can I follow you there?
I see people leave Less Wrong for similar reasons all the time. In my optimistic moods, I try to understand the problem and think up ways to fix it. In my pessimistic moods, I conclude that this blog and its meetups were doomed from the start; that the community will retain only those women who are already dating people in the community; and that the whole thing will end in a whimper.
This needs to be a primary concern during the setting-up of the rationality spin-off SIAI is planning. It needs to be done right, at the beginning.
Oh, please stay!
I assume that you are overconfident about many of the statements you made (and/or underestimate the inferential gap). I agree with some of what you’ve said, but for some of your claims there seems to be no convincing argument in sight (either way), and so one shouldn’t be so certain when passing judgment.
Downvoted for lack of specifics.
I think I understand your point about overconfidence. I had thought about the post for a day or two, but I wrote it in one go, so I probably didn’t end up expressing myself as well as I could have. I had originally intended to include a disclaimer in my post, but for reasons that now seem obscure I left it out. When making statements as strong and sweeping as mine, one should minimize ambiguity far more thoroughly than I did.
So, to explain myself a little better: I don’t hold the opinion that what I called “bullshit” is common enough here to make it, in itself, a “failing of this community”. The “bullshit” was, after all, limited to certain threads and certain individuals. What I’m lamenting and attributing to the whole community is a failure to react to the “bullshit” properly. Of course, that’s a sweeping generalization in itself; certainly not everyone here failed to react in what I consider a proper way. But amid the multitude of opinions, the broadest consensus seemed to be that the reaction might be hypersensitivity, and that the “bullshit” should be discouraged only because it offends and excludes people (and not because it offends and excludes them for irrational reasons).
And as for overconfidence about my assessment of the “bullshit” itself, I don’t really want to argue about that, any more than I’d want to argue with people who think atheists should be excluded from public office. (Can you imagine an alternate LW in which the general consensus was that that’s a reasonable, though extreme, position to take? That might give an only slightly exaggerated example of how bizarrely out of place I considered the gender debate to be.) If pressed, I will naturally agree to defend my statements. But I wouldn’t really want to have to, and restarting the debate probably isn’t in anyone else’s best interests either. So, I’ll just have to leave the matter as something that, from my perspective, lessens my appreciation of the level of discourse here in quite a disturbing way. Still, that doesn’t mean that LW wouldn’t get the best marks from me as far as the rationality of the internet communities I know is concerned, or that a lowered single value for “the level of discourse” lessens my perception of the value of other contributions here.
Now, the latest top-level post critiquing Bayesianism looks quite interesting; I think I’d like to take a closer look at that...