It can be hard to find good content in the diaspora. Possible solution: Weekly “diaspora roundup” posts to Less Wrong. I’m too busy to do this, but anyone else is more than welcome to (assuming both people reading LW and people in the diaspora want it).
This is what /r/RationalistDiaspora was intended to do. It never really got traction, and is basically dead now, but it still strikes me as a good solution. If that’s not going to revive though, I agree that a weekly thread on LW is worth trying. By default, I’ll make one later this week. (I’m not currently sure I’ll have anything to post in it myself; I’ll be asking people to post links in the comments.)
Go tell Scott Alexander you’ll build an online forum to his specification, with SSC community feedback, to provide a better solution for his overflowing open threads.
He tried to move people to /r/SlateStarCodex, but that didn’t work. We’d want to understand why. (Some hypotheses: it wasn’t actually on SSC, where people go directly; posts there don’t pop up in their RSS readers; people have an aversion to comment systems with voting; people have an aversion to reddit specifically.)
As Scott features more and more posts, he gains a moderation team full of people who wrote posts that were good enough to feature.
I’m not sure that “writes good posts” and “would make a good moderator” are sufficiently correlated for this to work. A lot of people like Eliezer’s writing but dislike his approach to moderation.
(On the other hand: maybe, if we want Eliezers to stick around, we need them to be able to shape the community? Even if that means upsetting people who don’t write much.)
It also creates weird incentives, like: “I liked this post that was highly critical of our community, but I don’t want the author to be a mod”. (This is the problem Scott Aaronson points to: “this system can only improve on ordinary democracy if the trust network has some other purpose”. I worry that voting-for-comment-scores isn’t a sufficiently strong purpose to outweigh voting-for-moderators.)
Another system to consider would be to base it on how people cast votes, not on the votes they receive. If your votes tend to correlate with others’, they carry more weight in future. If posts you flag tend to get removed, your flags count for more. (I’m not convinced that this works either.)
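To make the flag-weighting half of that idea concrete, here’s a minimal sketch. All the names, constants, and update rules are invented for illustration; no existing site necessarily works this way.

```python
# Hypothetical agreement-based flag weighting: a user's flag weight rises
# when their flags match moderator outcomes and falls when they don't.

class Flagger:
    def __init__(self):
        self.weight = 1.0  # everyone starts with a neutral flag weight

    def record_outcome(self, flag_upheld: bool, rate: float = 0.1) -> None:
        # Nudge weight up when a flagged post was actually removed,
        # down when moderators declined to act on the flag.
        if flag_upheld:
            self.weight *= 1 + rate
        else:
            self.weight *= 1 - rate

def flag_score(flaggers: list[Flagger]) -> float:
    # A post's effective flag score is the sum of its flaggers' weights,
    # so trusted flaggers can trigger review with fewer co-flaggers.
    return sum(f.weight for f in flaggers)

u = Flagger()
for _ in range(5):
    u.record_outcome(flag_upheld=True)
print(round(u.weight, 3))  # 1.1 ** 5 ≈ 1.611
```

The multiplicative update is just one choice; an additive or Bayesian rule would work similarly, and the decay rate controls how fast trust is lost.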
He tried to move people to /r/SlateStarCodex, but that didn’t work.
He didn’t really try. All he did was mention offhand a couple of times that if people are unhappy with how the comment section works, there is the subreddit and it looks reasonable to him.
It would not be hard for Scott to move people to the subreddit: put a link to it at the end of each article, plus go there and respond to comments himself.
He tried to move people to /r/SlateStarCodex, but that didn’t work. We’d want to understand why. (Some hypotheses: it wasn’t actually on SSC, where people go directly; posts there don’t pop up in their RSS readers; people have an aversion to comment systems with voting; people have an aversion to reddit specifically.)
I think a big explanation is that /r/SlateStarCodex was not advertised sufficiently, and people never developed the habit of visiting there. I imagine that if Scott chose to highlight great comments or self posts from /r/SlateStarCodex each week, the subreddit would grow faster, for instance.
Online communities are Schelling points. People want to be readers in the community where all the writers are, and vice versa. Force of habit keeps people visiting the same places over and over again, but if they don’t feel reinforced through interesting content to read / recognition of their writing, they’re liable to go elsewhere. The most likely explanation for why any online community fails, including stuff like /r/RationalistDiaspora and /r/SlateStarCodex, is that it never becomes a Schelling point. My explanation for why LW has lost traffic: there was a feedback loop involving people not being reinforced for writing and LW gradually losing its strength as a Schelling point.
Edit: also, subreddits are better suited to link sharing than original posts IMO.
I’m not sure that “writes good posts” and “would make a good moderator” are sufficiently correlated for this to work. A lot of people like Eliezer’s writing but dislike his approach to moderation.
Acknowledged, but as long as the correlation is above 0, I suspect it’s a better system than what reddit has, where ability to vote is based on possession of a warm body.
It also creates weird incentives, like: “I liked this post that was highly critical of our community, but I don’t want the author to be a mod”.
Concrete example: Holden Karnofsky’s critical post was liked by many people. Holden has posted other stuff too, and his karma is 3689. That would give him about 1% of Eliezer’s influence, 4% of Yvain’s influence, or 39% of my influence. This doesn’t sound upsetting to me and I doubt it would upset many others. If Holden was able to, say, collect mucho karma by writing highly upvoted rebuttals of every individual sequence post, then maybe he should be the new LW moderator-in-chief.
But even if you’re sure this is a problem, it’d be simple to add another upvote option that increases visibility without bestowing karma. I deliberately kept my proposal simple because I didn’t want to take away the fun of hashing out details from other people :) I’m in favor of giving Scott Alexander “god status” (ability to edit the karma for every person and post) until all the incentive details are worked out, and maybe even after that. In the extreme, the system I describe is simply a tool to lighten Scott’s moderation load.
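For what it’s worth, the arithmetic behind those percentages, assuming moderation influence scales linearly with karma. The other users’ karma totals below are back-solved from the quoted ratios for illustration, not real figures:

```python
# Illustrative karma totals that reproduce the quoted influence ratios,
# under the assumption that influence is directly proportional to karma.
holden = 3689
eliezer = 369_000   # ≈ holden / 0.01, invented for illustration
yvain = 92_000      # ≈ holden / 0.04, invented for illustration
me = 9_500          # ≈ holden / 0.39, invented for illustration

for name, karma in [("Eliezer", eliezer), ("Yvain", yvain), ("me", me)]:
    print(f"Holden has {holden / karma:.0%} of {name}'s influence")
```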
(This is the problem Scott Aaronson points to: “this system can only improve on ordinary democracy if the trust network has some other purpose”. I worry that voting-for-comment-scores isn’t a sufficiently strong purpose to outweigh voting-for-moderators.)
So I guess the analogy here would be if I want a particular user to have more influence, I’d vote up a post of theirs that I didn’t think was very good in order to give them that influence? I guess this is a problem that would need to be dealt with. Some quick thoughts on solutions: Anonymize posts before they’re voted on. Give Scott the ability to “punish” everyone who voted up a particularly bad post and lessen their moderation abilities.
Another system to consider would be to base it on how people cast votes, not on the votes they receive. If your votes tend to correlate with others’, they carry more weight in future. If posts you flag tend to get removed, your flags count for more. (I’m not convinced that this works either.)
A related idea that might work better: Make it so downvotes work to decrease the karma score of everyone who upvoted a particular thing. This incentivizes upvoting things that people won’t find upsetting, which works against the sort of controversy the rest of the internet incentivizes. But there’s no Keynesian beauty contest because you can never gain points through upvoting, only lose them. This also creates the possibility that there will be a cost associated with upvoting a thing, which makes karma a bit more like currency (not necessarily a bad thing).
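Here’s a toy model of that “downvotes tax the upvoters” rule. The class names, starting karma, and penalty size are all invented; this is a sketch of the incentive, not a spec.

```python
# Upvoting is free at the time of the vote, but each later downvote
# charges every upvoter some karma, so upvotes carry risk and karma
# behaves a bit like currency.

class User:
    def __init__(self, karma: float = 100.0):
        self.karma = karma

class Post:
    def __init__(self, author: User):
        self.author = author
        self.upvoters: list[User] = []
        self.score = 0

    def upvote(self, user: User) -> None:
        self.upvoters.append(user)
        self.score += 1

    def downvote(self, penalty: float = 1.0) -> None:
        self.score -= 1
        for u in self.upvoters:
            u.karma -= penalty

a, b = User(), User()
p = Post(author=User())
p.upvote(a)
p.upvote(b)
p.downvote()
p.downvote()
print(p.score, a.karma, b.karma)  # 0 98.0 98.0
```

Note that since upvoting can only ever cost you karma, there is no way to farm points by predicting what others will upvote, which is what blocks the beauty-contest dynamic.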
The Less Wrong diaspora demonstrates that the toughest competition for online forums may be individual personal blogs. By writing on your personal blog, you build up your own status & online presence. To be more competitive with personal blogs, it might make sense to give high-karma users of a hypothetical SSC forum the ability to upvote their own posts multiple times, in addition to those of others. That way if I have a solid history of making quality contributions, I’d also have the ability to upvote a new post of mine multiple times if it was an idea I really wanted to see get out there, in the same way a person with a widely read personal blog has the ability to really get an idea out there. The mechanism I outlined above (downvotes taking away karma from the people who upvoted a thing) could prevent abuse of self-upvoting: if I self-upvote my own post massively, but it turns out to be lousy, other people will downvote it, and I’ll lose some of the karma that gave me the ability to self-upvote massively.
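A self-contained sketch of that abuse check, with invented names and numbers: each self-boost registers the author as an upvoter one more time, so every later downvote charges them once per boost.

```python
# Self-upvoting with karma at risk: a massively self-boosted post that
# flops bleeds karma fast, per the upvoter-penalty idea described above.

class Author:
    def __init__(self, karma: float):
        self.karma = karma

class Post:
    def __init__(self, author: Author):
        self.author = author
        self.score = 0
        self.self_boosts = 0

    def self_upvote(self, times: int) -> None:
        self.score += times
        self.self_boosts += times

    def downvote(self, penalty: float = 1.0) -> None:
        # Each downvote charges the author once per self-boost.
        self.score -= 1
        self.author.karma -= penalty * self.self_boosts

me = Author(karma=100.0)
p = Post(me)
p.self_upvote(times=10)
for _ in range(5):
    p.downvote()
print(p.score, me.karma)  # 5 50.0
```

A real system would presumably also cap the number of self-boosts as some function of current karma, so influence stays proportional to track record.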
(I’ve mostly only skimmed.)
I share your skepticism about weighting votes and flags by past agreement.
StackExchange uses a flag weight model. They removed it from the visible section of the profile (http://meta.stackexchange.com/questions/119715/what-happened-to-flag-weight) but I think they still use it internally.