The biggest problem as I see it is loss of members and lack of talented new ones. I’m willing to bet if you plotted the histogram of user activity, you’d see virtually all of the posts and comments coming from a very small number of members. The ‘Top Contributors’ section in the side panel probably contains most of them, and has been relatively stable in composition over the past two years. If the first step in instrumental rationality is to identify reality in an objective way, then we have to realize this site has become an echo chamber for a small few, with their own vocabulary and system of thought which is incompatible with the outside world. The barriers to entry (reading the sequences, reading the ‘seminal’ comment threads, etc.) are too high for most people. HPMOR offers a pathway for new members to start reading the rationality materials, but it doesn’t equip them to meaningfully contribute.
Another thing is that there is only so much ‘low-hanging fruit’ lying around. In terms of general rationality, we’ve covered most everything. There are only so many threads you can have about biases and logical inconsistencies. The topic has become quite stale.
I like AI because at least it offers the possibility for new material to arise every once in a while, leading to useful discussions. Other people might like other topics. I have a huge list of topics whose implications are quite relevant to the art of rationality and so would be quite compatible with the goals of this site:
Thermodynamics
Neuroscience
Neural Networks
Social organization & Forms of government
Human sexual dynamics
The problem is that the types of people likely to be knowledgeable about these topics probably have no idea this site exists. And if they do, it is unreasonable to expect them to learn the background required to be ‘on the same page’ as this site’s core users. This is very bad, because it means that this site’s users will attempt to foray into these topics themselves without any help from actual experts.
I don’t know what the solution is. Maybe it’s already too late to do anything.
In regards to the article, the idea of subsections or separations of topics has already been raised in this post. I personally think it would make more sense to let users create personal sections based on query results; in other words, a saved search query. It would be a section that you can name, shown only to you, that runs its search when you enter it. The search itself would just be a normal search of the kind you can run now.
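A minimal sketch of the saved-search section described above; the names (`SavedSection`, `run_section`) and the substring search are my own illustration, not an actual site feature:

```python
from dataclasses import dataclass

@dataclass
class SavedSection:
    """A user-named personal section backed by a stored search query."""
    name: str   # label shown only to the owning user
    query: str  # an ordinary site search string, re-run on entry

def run_section(section, posts):
    """Re-run the stored query: here, a plain case-insensitive substring search."""
    q = section.query.lower()
    return [p for p in posts if q in p.lower()]

posts = ["Meetup: Boston", "On calibration", "Calibration exercises"]
section = SavedSection(name="My calibration feed", query="calibration")
print(run_section(section, posts))  # the two calibration posts, meetup filtered out
```

The point of the design is that no new content type is needed: a "section" is just a name plus a query string, so the existing search machinery does all the work.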
Some ideas I have had on improving the site are:
An announcements sub-section. One reason fewer people might be coming to the site is that Main is clogged up with Meetup announcements.
Optional comment when liking or disliking. The like/dislike tally is useful for sorting posts by quality, but it doesn’t really help improve existing posts. If your post gets disliked, most of the time you don’t know why. If it was disliked with a comment, you could spend effort fixing the post, reply to the comment, and the disliker would hopefully retract the dislike. This would encourage you to keep improving posts after they have been created. The comment would only be sent to the article’s creator.
A section whose post karma does not affect your personal karma. This would help new members who start out posting lower-quality posts and then give up because their account has poor karma. I think your ability to post in Discussion or Main should be based on post and comment karma, which means that posts in the section without personal karma would, if liked enough, still allow you to post in Main.
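The karma split proposed above can be sketched as follows; the section name, threshold, and field names are hypothetical, chosen only to show the separation between personal karma and posting eligibility:

```python
SANDBOX = "sandbox"  # hypothetical section whose posts don't touch personal karma
MAIN_THRESHOLD = 20  # assumed karma cutoff for posting in Main

def personal_karma(posts):
    """Sum karma over all of a user's posts except those in the sandbox section."""
    return sum(p["karma"] for p in posts if p["section"] != SANDBOX)

def may_post_in_main(posts):
    """Eligibility counts every post, sandbox included, so a well-liked
    sandbox post can still unlock posting in Main."""
    return sum(p["karma"] for p in posts) >= MAIN_THRESHOLD

posts = [
    {"section": SANDBOX, "karma": 25},       # a liked sandbox post
    {"section": "discussion", "karma": -3},  # an early, poorly received post
]
print(personal_karma(posts))    # -3: sandbox karma is not counted personally
print(may_post_in_main(posts))  # True: the 22 total clears the threshold
```

The asymmetry is the whole idea: bad early posts can't sink a new user's visible karma, but good sandbox posts still count toward earning posting rights.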
In regards to the above comment, this is what I think of your points:
The barriers to entry (reading the sequences, reading the ‘seminal’ comment threads, etc.) are too high for most people—I agree. This is why I have started writing a rationality primer. I will post an article with more details on this maybe next week.
There is only so much ‘low-hanging fruit’ lying around. In terms of general rationality—while this is true, I think there is still a lot of ground that could be covered in terms of applied general rationality. By this I mean practical advice. This also doesn’t require expert domain knowledge; you just need to go out and actually try something. I think Less Wrong can improve in this area.
I like AI because at least it offers the possibility for new material to arise every once in a while—I think there is a lot of content tangentially related to rationality that could be extremely helpful. Once again, I think a more practical focus would get people thinking in more divergent directions. Personally, I plan to write some sequences around the idea of strategy; they would touch on mental models, complexity theory, system dynamics, Boydian thinking, and maybe some other things.
I’d go even further and make the comment mandatory. Or remove likes/dislikes altogether and use ratings like “irrational” or “off-topic” or “I personally disagree” instead, along with the ability to add some more explanation.
+5 insightful, +5 funny :-/
It’s not a new idea and it has been tried. Slashdot spent years experimenting with different karma models—that is valuable empirical data.
Irrational; off-topic; trite; redundant; attitude; poorly written/expressed; I personally disagree.
Rational; interesting/clever; positive vibe man!; I don’t really care, but feel that you have been unfairly downvoted.
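The two label lists above amount to a typed vote in place of a bare up/down click; a sketch, using only the labels suggested in this thread plus a hypothetical optional-explanation field:

```python
from enum import Enum

class DownvoteReason(Enum):
    IRRATIONAL = "irrational"
    OFF_TOPIC = "off-topic"
    TRITE = "trite"
    REDUNDANT = "redundant"
    ATTITUDE = "attitude"
    POORLY_WRITTEN = "poorly written/expressed"
    PERSONAL_DISAGREEMENT = "I personally disagree"

class UpvoteReason(Enum):
    RATIONAL = "rational"
    INTERESTING = "interesting/clever"
    POSITIVE_VIBE = "positive vibe"
    UNFAIRLY_DOWNVOTED = "unfairly downvoted"

def vote(reason, explanation=""):
    """A vote is a typed reason plus an optional free-text explanation."""
    return {"reason": reason.value, "explanation": explanation}

print(vote(DownvoteReason.PERSONAL_DISAGREEMENT))
```

Even with the explanation left empty, the typed reason alone answers the "why was I downvoted?" question raised elsewhere in this thread.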
Problem: I suspect that most people who downvote because “I personally disagree” might actually click “irrational”.
I think on LW a lot of people can distinguish “I personally disagree” from “you don’t provide a proper argument”.
My thought was that most people who do make the distinction between “I personally disagree” and “you don’t provide a proper argument” are also the people who are unlikely to vote down simply because they disagree. I could be wrong.
I think it depends on the context. There are cases on LW where voting down because you disagree makes sense, and others where it doesn’t.
This is likely, and requiring a comment or some additional feedback might be a good idea for calling something irrational.
You can imagine that I clicked “I personally disagree” on your comment.
The problem is, unless I also explain why, I will seem like a jerk. But if I explain, it will cost me time, so naturally after writing a few careful explanations I will simply stop downvoting.
The suggestion for different types of rating is a particularly good idea. Why you were downvoted isn’t always obvious, and this makes it hard to get feedback from a downvote. I think too many people downvote based upon their agreement with a commenter’s position rather than how well it is justified, which isn’t how I think the system should work. I’ve upvoted a number of comments I disagree with but thought were argued well.
I find it amusing that the comment above this one was downvoted without explanation.
We should somehow survey the users to see whether their alone-in-a-crowd attitude prevents them from actively posting, and then encourage them to post, because compartmentalization is not a good habit. And I think people who agree that you should help someone having a seizure no matter how many bystanders are present, but not that you should speak out ‘what is obvious’ when other users with a history of commenting are around, are compartmentalizing.
(OTOH, maybe it’s me rationalizing asking much and answering little.)