Also, for me karma has a definite role as a motivator to make intelligent, well-thought-out comments and posts.
Indeed, when a post of mine gets high karma I can’t help but think “wow, I’m brilliant”. However, I’m afraid that the reality of karma doesn’t quite match our ideas about it. Your conscious understanding of the effect of karma on your commenting is that it motivates you to make intelligent, well-thought-out comments. But if you’re like a typical human, and I think you are, then you are adapting your writing, largely without conscious awareness, in the direction of whatever maximizes karma, even when that conflicts with maximizing rationality—and, I think, it quite often does. I’ve seen karma turn other sites useless. I think there must be some way to improve or replace karma systems so that their intended function is performed without ultimately ruining the forum.
But you know what, I’m not sure that karma would be improved even if it could be made to better serve its supposed purpose. Karma, as currently implemented, is status, and people love status; they love pecking orders. So, even though karma-as-status serves our darker selves, it’s probably not going to go away.
A more genuinely useful rating system would probably move in the direction of the Netflix rating system. But such a system would not give a person an overall score, and therefore a rank, within the community. Rather, it would score a person differently depending on who was looking at the score. The score I saw would not be the score somebody else saw.
And there would be no gradual accumulation of karma over time, which mimics seniority. People love their seniority; it’s a much-loved form of status. Instead, in a Netflix-like system, new participants would quickly, almost immediately, achieve a karma matching that of long-time participants. Nobody would want that. People want their status, their pecking order, so even though it doesn’t genuinely serve rationality, that’s probably what they’re going to stick with.
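To make the viewer-dependent idea concrete: here is a minimal sketch of how a score could differ per viewer, weighting each rater’s vote by how similar that rater’s past voting history is to the viewer’s. This is purely illustrative—all function names and data shapes are invented, and it is not how Netflix (or any real site) actually computes ratings.

```python
def cosine_similarity(a, b):
    """Cosine similarity between two sparse vote histories (dicts: item -> +1/-1)."""
    common = set(a) & set(b)
    dot = sum(a[i] * b[i] for i in common)
    na = sum(v * v for v in a.values()) ** 0.5
    nb = sum(v * v for v in b.values()) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def personalized_score(viewer_history, raters):
    """Score one item for one viewer.

    raters is a list of (rater_history, vote_on_item) pairs; each rater's
    vote is weighted by how much their history resembles the viewer's.
    Raters with zero or negative similarity are ignored.
    """
    total = weight = 0.0
    for rater_history, vote_on_item in raters:
        w = cosine_similarity(viewer_history, rater_history)
        if w > 0:
            total += w * vote_on_item
            weight += w
    return total / weight if weight else 0.0
```

Two viewers with opposite voting histories would see opposite scores for the same item, so there is no single community-wide number to accumulate—which is exactly the status-destroying property described above.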
I think the karma system on LessWrong works surprisingly well, as long as people remember that “Vote up” and “Vote down” mean “more like this” and “less like this”, rather than “agree” and “disagree”. There are standard beliefs and some groupthink, but you can still get upvoted for quite cutting criticisms if you show in your comment that you’ve done your homework and understand what you’re objecting to.
I don’t think there’s anything broken about the current system. Certainly the comments on LessWrong are exceedingly high quality in general, particularly compared to pretty much any other site.
I believe that the quality of the comments could very easily be independent of the existence of the karma system and dependent, instead, on the high quality and low number of participants. It might well be that pretty much any crude moderation system would work about as well. I remember certain Usenet groups which were quite high quality, in particular comp.ai.philosophy (I think it was called), back around 1991 or so. I had some satisfying discussions there, at quite a high level. So, it’s not as though high-quality conversation was not to be had in an unmoderated forum, provided the participants were sufficiently few and sufficiently good—which I think was largely achieved in that group. In larger groups there was more noise.
Karma depends on voters, so a low-population forum will not be much affected by karma. Karma really kicks in, really affects what goes on, when the number of participants goes up. And what happened at Digg and Reddit is an example of what I expect to happen anywhere a forum explodes: karma becomes a powerful tool of groupthink, firmly establishing an echo chamber.
I suspect this is part of the normal lifecycle of Internet forums. A Group Is Its Own Worst Enemy by Clay Shirky is the standard work on the topic.
Contrariwise, a group norm against status rankings does not stop them happening—it just means they form where you’re not looking and bite you in the backside. The Tyranny of Structurelessness by Jo Freeman is the standard work on this topic.
To summarise the summary: people remain a problem.