Suggestions for LW features that could shape its culture by (dis)incentivizing certain behavior (without thinking about how hard they would be to implement):
On How LW Appears to Outside Readers
There’s a certain kind of controversial post that inevitably generates meta-discussion of whether people should be allowed to post it here (most recently in this book review). Crucially, the arguments I see there are not “I don’t like this” but usually “I’m afraid of what will happen when people who don’t like this see it, and associate LW with it”. (I found this really tedious, and would prefer a culture where we stick our own heads out and stick to “I don’t like this” rather than appealing to third parties.) Also, I wish there were a way to preempt this objection, so as to not fight the same battle over and over.
Other posts are in dispute (as in, have an unusually high fraction of downvotes), like jessicata’s post, but a casual reader might only see the post (and maybe its positive karma score), with all the controversy and nuance happening in the utterly impenetrable comments section.
So what might one do about that?
Some Reddit-style sites compute a controversy score (percentage of downvotes). Then one could sort by controversy, or by default filter controversial posts from the frontpage, or visibly flag controversial posts in a way people not familiar with LW would understand, or tag as “controversial”, or something.
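To make the percentage-of-downvotes idea concrete, here is a minimal sketch of a controversy score, loosely modeled on the shape of Reddit's open-source "controversial" sort. The function and its behavior are invented for illustration; LW exposes no such metric:

```python
def controversy_score(upvotes: int, downvotes: int) -> float:
    """Score is high when a post has many votes AND they are evenly split.

    Hypothetical sketch: magnitude (total votes) raised to the vote
    balance, loosely modeled on Reddit's open-source controversy sort.
    """
    if upvotes <= 0 or downvotes <= 0:
        return 0.0  # unanimously voted posts are not controversial
    magnitude = upvotes + downvotes
    balance = min(upvotes, downvotes) / max(upvotes, downvotes)
    return magnitude ** balance

# A near-50/50 split on a busy post scores far higher than a
# lopsided split with the same total number of votes:
# controversy_score(100, 100) > controversy_score(190, 10)
```

Sorting or filtering by such a score surfaces exactly the disputed posts, while unanimously upvoted (or downvoted) posts score zero.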
If controversial posts are rare, this problem could also be tackled by mod intervention. For instance, mods could have the power to put a disclaimer (or a disclaimer template) above a post, or these could be triggered automatically by specific metrics. Some (bad) examples:
“This post has a notable fraction of downvotes, and a high number of nested comment threads. This indicates that it’s in dispute. Check the comments for details.”
Or: a manual or automatic disclaimer on high-traffic non-frontpaged private blog posts (like the book review) to indicate to readers who come from elsewhere: This post is not frontpaged. LW has less strict moderation standards for private blogs. The karma score (upvotes) of a post reflects positive sentiment towards the poster’s contribution (e.g. taking the time to write a review); it is not automatically an endorsement of <the reviewed item>. And so on.
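An automatic trigger for such a disclaimer could be as simple as a threshold check on a couple of metrics. This is a hypothetical sketch; the metric names and thresholds are invented:

```python
def needs_dispute_disclaimer(upvotes: int, downvotes: int,
                             max_thread_depth: int,
                             downvote_frac_threshold: float = 0.3,
                             depth_threshold: int = 5) -> bool:
    """Hypothetical trigger: flag a post as 'in dispute' when a notable
    fraction of its votes are downvotes AND comment threads run deep
    (deep nesting is a rough proxy for back-and-forth argument)."""
    total = upvotes + downvotes
    if total == 0:
        return False
    downvote_frac = downvotes / total
    return (downvote_frac >= downvote_frac_threshold
            and max_thread_depth >= depth_threshold)
```

Requiring both conditions avoids flagging posts that are merely unpopular (many downvotes, little discussion) or merely lively (deep threads, near-unanimous votes).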
Handling High-Stakes Controversies
Duncan’s post notes various ways in which recent controversies were not handled optimally. Some (probably bad) suggestions for site features to help handle such situations better:
The disclaimer thing, so new readers know that a post is controversial before they read it and take its claims at face value. This might also be warranted for some high-karma controversial comments.
Moderation
Mods could try to “turn down the heat” by setting stricter commenting guidelines, or temporarily prevent users below some karma threshold from posting, or something.
Flagging important clarifying comment threads so they appear higher in the comment order, or with extra highlighting.
Mark comments by moderators-acting-as-moderators with a flair or other highlighting.
On-site low-friction ways to post anonymously, with the option of leaking some non-identifying information like “I’ve been a LW user for >3 years with >1k karma”, or to ping specific LW users with “user X can vouch for my identity”, which user X could then confirm with a single click. Though I would not want this anonymity feature available on non-controversial posts.
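The anonymity-with-vouching idea above could be sketched as a small data model. All names and fields here are invented for illustration, not an actual LW design:

```python
from dataclasses import dataclass, field

@dataclass
class AnonymousPost:
    """Hypothetical sketch of an anonymous post that carries only
    coarse, non-identifying attestations about its author."""
    body: str
    # e.g. "account age > 3 years", "karma > 1000" -- computed
    # server-side so they can't be faked; raw values stay hidden
    attestations: list[str] = field(default_factory=list)
    # users the (hidden) author asked to vouch, and whether they did
    vouch_requests: dict[str, bool] = field(default_factory=dict)

    def confirm_vouch(self, username: str) -> None:
        """The one-click confirmation by a pinged user."""
        if username not in self.vouch_requests:
            raise ValueError(f"{username} was not asked to vouch")
        self.vouch_requests[username] = True

    def confirmed_vouchers(self) -> list[str]:
        return [u for u, ok in self.vouch_requests.items() if ok]
```

The key design point is that readers only ever see the attestation strings and the confirmed voucher names, never anything that identifies the author.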
A feature for a user to “request moderation for this post”, or “request stricter commenting guidelines” or to indicate “this post seems controversial to me” or something.
Of course such features don’t solve a problem by themselves, but they can help, and I’m more optimistic about attempts to improve a culture if the site infrastructure supports those attempts and incentivizes the improved culture.
Rewarding Exceptional Content
As noted in the LW book review bounty program, exceptional content on LW is potentially very valuable, so it makes sense to incentivize it. The karma system helps, but it’s not enough—as can be seen when extra incentives come into play, e.g. the extra reviews generated by the bounty program.
Some features beyond the karma system that could help here:
Reddit nowadays has a separate “awards” system which users can use to reward exceptional content. I don’t like the specific implementation at all—it’s full of one-upmanship of progressively more expensive awards, and posts with lots of awards just look cluttered—but one could imagine an implementation that would work here.
For instance, there are a number of active bounties on LW, but one could imagine a smaller-scale version of setting and rewarding bounties that would work better if built into the site, e.g. for Question posts (to reward the best answer), or just to gift someone money for writing a particularly important post or comment.
That said, this is the kind of thing that, if implemented suboptimally, could easily incentivize detrimental behavior, instead.
Or (like mentioned in the Controversies section), what about a system for users or mods to flag particularly high-quality or high-importance comments so they get some extra highlighting or something?
Related:
Because comments are much less discoverable than posts, lots of high-effort high-value comments, even if they’re very-high-karma, get lost in the masses of LW comments and are hard to find or refer to later on.
What could be done about that?
For instance, if users or mods see a comment thread of exceptional and enduring value, they could flag it as such (I already occasionally see follow-up comments of the form “This is good enough for a top-level post!”), and then others (volunteers or paid contributors) could turn the best ones of those into top-level posts, with karma going to the original posters.
(To end on a meta comment: I spent >2.5h on three high-effort comments in this thread, and would be disappointed if they got lost in the shuffle. Conversely, I’m more likely to make the effort in the future if I have a sense that it paid off in some way.)
First of all, thanks for your three comments, I think they provide valuable analysis and suggestions.
Another thing I notice about controversial posts: because of how the front page works, posts that get a lot of comments get more exposure, since they show up in recent discussion, while posts that are correct and valuable but uncontroversial are likely to get fewer comments (even if everyone who upvoted them left a simple “good post” comment) unless they somehow manage to generate discussion.
I’m not entirely sure whether that’s a “bug” or a “feature”. On the one hand, posts that are agreed to be good and valuable get drowned out; on the other hand, perhaps it’s exactly the controversial posts that deserve attention, so that the controversy can be resolved?
One way to counteract that is to leave more simple, generic comments like “Thanks for writing this”, “This was great”, “I enjoyed reading this”, etc. People (including me, sometimes, when I consider making them) worry about not adding anything substantial, but I don’t think that’s a problem: I like seeing these comments, and the karma system should help push the more substantial comments near the top.
That’s a social suggestion, though, rather than a feature that can be implemented on the website. I don’t have an idea for a feature that could deal with that (the main mechanisms currently are the ‘magic’ sorting on the frontpage, curation, and the frontpage recommendations, but I don’t think the latter two have a big impact on this).
For instance, there are a number of active bounties on LW, but one could imagine a smaller-scale version of setting and rewarding bounties that would work better if built into the site, e.g. for Question posts (to reward the best answer), or just to gift someone money for writing a particularly important post or comment.
That said, this is the kind of thing that, if implemented suboptimally, could easily incentivize detrimental behavior, instead.
I like the idea (I’ve had it myself as well) of having a feature that lets users directly give monetary rewards to other users for comments and posts. One of the problems of a Reddit-style awards system is that it’s still internet points at the end of the day, and there’s a limit to how much internet points can be worth.
On the other hand, money has obvious utility, and would definitely create an incentive to post very good comments and posts, and to give even more polish to very good comments and posts you would have written anyway. At the extreme, it could let particularly good and prolific users earn some tangible revenue from their participation on LessWrong (a bit like having a Patreon), which seems like a great thing to me.
I’m curious what you think are the suboptimal ways (and the optimal one) to implement this and what detrimental behavior they can incentivize?
Because comments are much less discoverable than posts, lots of high-effort high-value comments, even if they’re very-high-karma, get lost in the masses of LW comments and are hard to find or refer to later on.
One thing that was discussed in the past that could help is an option to apply tags to comments, and have them show up on tag pages somehow. There are pros and cons and implementation details that were discussed, but generally speaking I think this is an interesting suggestion and would like to see it tried.
On the one hand, I like the gesture of commenting nice but relatively empty stuff like “this was great”. On the other hand, I dislike spam, and this feels somewhat redundant with the karma system. I’m not sure what I think about this.
I’m curious what you think are the suboptimal ways (and the optimal one) to implement this and what detrimental behavior they can incentivize?
As one random example: I could pay $10 for any comment that agrees with me, and for any comment that criticizes one of my own critics. I wouldn’t even need to say this explicitly, and yet over time it would still absolutely warp discussions.
And what if a critic notices that and offers $20 each? Then we’re suddenly in an arms race.
I’m not sure how to prevent failure modes like that.
Oh, the way I imagined it, the reward isn’t visible and doesn’t influence how comments are shown. So it could hypothetically create an incentive to make comments that agree with the payer, but if it’s not visible and has no direct influence, that seems very unlikely. If it is visible, it’s more likely, but it still doesn’t seem like it would be a big problem. Maybe moderators could have automatic monitoring for that kind of thing, like they have for mass up/down voting?
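Such monitoring could start as a simple heuristic over the tip log, analogous to mass-voting detection. Everything here is an invented sketch, not an existing LW mechanism:

```python
from collections import Counter

def flag_suspicious_tippers(tips, pair_threshold: int = 5):
    """Hypothetical moderation heuristic: given (payer, payee) tip
    records, flag pairs where one user tips another unusually often,
    analogous to the mass up/down-voting monitoring mentioned above.
    Flagged pairs would go to a human moderator, not be auto-punished."""
    counts = Counter((payer, payee) for payer, payee in tips)
    return [pair for pair, n in counts.items() if n >= pair_threshold]
```

A fuller version might also compare tip patterns against agreement in the comment tree, but even this pair-frequency check would catch the crudest pay-for-agreement schemes.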