How much experience have you had watching the trajectory of online communities?
Have you, for example, informed yourself of the case of Reddit (the original one), which is particularly relevant to this community given how similar the software is?
I have not studied this with any rigor, although I have seen communities that I previously enjoyed enter periods of decline (sometimes recovering later, sometimes not). I don’t disagree that online communities often have some tipping point at which the bad reasoning/noise outweighs the good. That’s why I also included this in my comment:
From my perspective, I don’t think we’ve reached the point where LW is so crowded with posts that good ideas and posts are being crowded out by bad ones.
Perhaps I’m wrong about this. At any rate, if LW is actually in a serious period of decline, the problem is more serious than just WrongBot, and I disagree with a solution in which individual posters take it upon themselves to ask other posters to leave. If EY wants to create some sort of system like Paul Graham’s, or appoint new moderators with these sorts of powers, that would be different in my view from this sort of ad hoc approach, which seems unlikely to work (due to both its ad hoc nature and its unenforceability) and also presents greater risks of abuse, decisions driven by personality conflicts, etc.
if LW is actually in a serious period of decline, the problem is more serious than just WrongBot
Agreed. In particular, LW has successfully weathered long flurries of comments and posts by people worse than WrongBot.
The primary sign that LW is in danger of becoming the kind of place that I and those I admire no longer want to visit is the (negative) magnitude of the score on comments asking WrongBot to stop writing on things beyond his skill and the (positive) magnitude of the scores of WrongBot’s replies to those (negatively scored) comments. That is new.
Note that the vast majority of readers of LW never attempt to create evolutionary arguments relevant to human behavior or summarize novel arguments made by others. I would hope that that is because they realize that it is too difficult for them.