In *Varieties of Argument*, Scott Alexander notes:

> Sometimes meta-debate can be good, productive, or necessary… If you want to maintain discussion norms, sometimes you do have to have discussions about who’s violating them. I even think it can sometimes be helpful to argue about which side is the underdog.
>
> But it’s not the debate, and also it’s much more fun than the debate. It’s an inherently social question, the sort of who’s-high-status and who’s-defecting-against-group-norms questions that we like a little too much. If people have to choose between this and some sort of boring scientific question about when fetuses gain brain function, they’ll choose this every time; given the chance, meta-debate will crowd out everything else.
This is a major thing we’re trying to address with LW2. But I notice a bit of a sense-of-doom about it, and just had some thoughts.
I was reading the Effective Altruism forum today, and saw a series of posts on the cost effectiveness of vaccines. It looked like decent original research, and in many senses it seems more important than most of the other stuff getting discussed (on either the EA forum or on LW). Outputting research like that seems like one of the core things EA should actually be trying to do. (More specifically – translating that sort of knowledge into impact.)
But, it’s way less fun to talk about – you need to actually be in a position either to offer worthwhile critiques of the information there, or to make use of the information.
(Did I read it myself? No. Lol)
And you can maybe try to fix this by making that sort of research high status – putting it in the curated section, giving out bonus karma, maybe even cash prizes. But I think it’ll continue to *feel* less rewarding than something that results in actual comments.
My current thought is that the thing that’s missing here is a part of the pipeline that clearly connects research to people who are actually going to do something with it. I’m not sure yet what to do with that.
Figure out what sorts of user behavior you wish to incentivize (reading posts people wouldn’t otherwise read? commenting usefully on those posts? making useful posts?), what sorts you wish to limit (posting, in general? snarky comments?), and apply EP/GP.