One person at the Paris meetup made the really interesting and, AFAICT, accurate observation that the more prominent a Less Wrong post is, the less likely it is to be high quality: comments are better than Discussion posts, which are better than Main posts (with several obvious and honorable exceptions).
I think it may have to do with the knowledge that anything displayed prominently will have a bunch of really smart people swarming all over it, critiquing it, and making sure you get very embarrassed if any of it is wrong. People avoid posting things they’re not sure about, so the things that get promoted to Main tend to be restatements of ideas that create pleasant feelings in everyone reading them without rocking any conceivable boat. The overly meta topics you’re talking about lend themselves to those restatements: for example, “We should all be more willing to try new things!” or “Let’s try to be more alert for biases in our everyday life!”
Potential cures include greater willingness to upvote posts that are interesting but imperfect, greater willingness to express small disagreements in “IAWYC but” form, and greater willingness to downvote posts that are applause lights or don’t present non-obvious new material. I’m starting to do this, but hitting that downvote button when there’s nothing objectively false or stupid about a post is hard.
I agree that theoretical-sciency-mathy-insightful stuff is less common now than when Eliezer was writing posts regularly. I suspect this is largely because writing such posts is hard. Few people have that kind of knowledge, thinking ability, and writing skills, and the time to do the writing.
As someone who spends many hours writing posts only to have them nit-picked to death by almost everyone who bothers to comment, I appreciate your advice to “express small disagreements in ‘IAWYC but’ form.”
As for your suggestion to downvote posts that “don’t present non-obvious new material,” I’m not sure what to think. My recent morality post probably contains only material that is obvious to someone as thoroughly familiar with LW material as you, Phil Goetz, Will Newsome, Vladimir Nesov, or many others. On the other hand, a great many LWers are not that familiar, or haven’t taken the time to apply earlier lessons to a topic like morality (and were thus confused when Eliezer skipped past these basics and jumped straight to ‘Empathic Metaethics’ in his own metaethics sequence).
I’m starting to do this, but hitting that downvote button when there’s nothing objectively false or stupid about a post is hard.
I don’t find it hard, but whenever I vote a comment below zero for not adding anything, it just gets fixed back to zero by someone who probably wouldn’t have voted otherwise.
I had resignedly assumed that the bland re-presentation of old material and the applause-light posts were part of a consciously directed memetic strategy. Apparently I’d underestimated the size of the disgruntled faction. From now on I will be less merciful with my downvote button.
As the author of one of the rehash posts, I agree that these sorts of topics are generally pretty boring to read. There’s nothing surprising or new in them, and they seem pretty obvious once you read them.
But the point (of mine, at least) wasn’t really to present new material so much as to push people into doing something useful. As far as I can tell, a large portion of LW readers don’t implement various easy life-improvement methods, and the post was intended mainly as a nudge to encourage people to use them.
A lot of the interesting stuff on LW is “applied rationality,” and it’s really fun to read, but I’m fairly skeptical as to how useful it is for most people. There’s nothing wrong with it being interesting and fun, but there are other things to talk about.
One part of what’s going on may be that the site allows anyone to register and vote. That creates a feedback loop: people who are less like the core demographic and more like the rest of the internet come in and vote for posts that appeal to the average internet user, which in turn draws more average internet users to register and vote, and all of this pressures the site toward becoming like every other site.
Another part of what’s going on may be that the site has been focusing more and more on the idea that rationality gives you easy and obvious personal superpowers (as opposed to just helping you figure out what goals to strive toward, and with what strategies). I’m not saying there’s no truth to that, but it doesn’t strike me as why most of us originally got interested in these issues, and a lot of the support for it feels like it was selected to prop up an easily marketable bottom line.
Agreed.
I enjoyed your morality post, as I do most of your posts, and certainly wouldn’t accuse it of not presenting non-obvious new material.
Perhaps it would be easier and/or more constructive to comment ‘I don’t disagree with anything here, but I don’t think this is valuable’?
Perhaps, but I expect far fewer people would do so: it’s less anonymous and more likely to cause confrontations/bad feelings.
Sounds like a great time to invoke some strategic applied sociopathy.
Well-Kept Gardens Die By Pacifism seems particularly relevant here.
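The register-and-vote feedback loop described above can be sketched as a toy simulation. This is a minimal illustration only: the function name, the parameters, and every number in it are made-up assumptions, not measurements of the actual site.

```python
def simulate_audience(core_share=0.9, rounds=30, pull=0.05):
    """Toy model of the voting feedback loop: each round, broad-appeal
    posts win votes in proportion to the non-core share of the audience,
    and those wins attract more average-internet readers, shrinking the
    core demographic's share further. All numbers are illustrative."""
    history = [core_share]
    for _ in range(rounds):
        # The larger the non-core share already is, the faster the shift:
        # this is the self-reinforcing part of the argument.
        core_share = max(0.0, core_share - pull * (1.0 - core_share))
        history.append(core_share)
    return history
```

Under these assumptions the core share only ever shrinks, and it shrinks faster the smaller it gets, which is the drift toward “every other site” that the comment describes.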