Idle observation:
Clippy gets consistently voted up on a lot of his comments because we find him amusing, and rarely gets downvoted because very few of his comments are substantive. We will end up looking extremely silly to new members if he gets enough karma to put him into the list of top contributors.
So… I guess whether this is actually an issue depends on whether we pitch ourselves as a shiny fun community or as serious rationalist Singularitarians.
Clippy gets downvoted quite a lot too (although less and less… he’s learning!). He also makes quality comments, including the expression of some insights that would be punished if made by a ‘human’. LessWrong humans sometimes try to bully each other into pretending to be naive utilitarians instead of rational agents with their own agendas.
This is not a singularitarian website (although rationalists are often singularitarians). Also note that we spend a lot of time here discussing fanfiction written by the lead researcher at the SIAI. We cannot credibly claim ‘sensibleness’ or sophistication.
Yes, but the vast majority of his comments concern his paperclip agenda. If a larger proportion of his comments were insightful rather than just funny I would be happier, but as it is his noise:quality ratio is rather high.
A significant part of the Sequences is made up of posts that argue, with complete seriousness, for a singularity in the near future. A large number of us are not singularitarians, but I don’t know whether I would say the community itself isn’t singularitarian.
We also have lots of posts about more serious topics. Having fun threads where we discuss HP and Twilight fanfiction doesn’t mean that the community as a whole isn’t trying to present itself at least somewhat seriously. And most top-level posts that are openly silly or non-substantive get heavily downvoted.
As an example of the somewhat serious nature of the community, there seem to be a fair number of people who have had personal epiphanies (mostly about atheism) that have had a huge impact on their life as a result of reading the Sequences.
On the other hand, in the early months of LessWrong the subject was explicitly banned. That was part of an effort to ensure that the blog identified as being about rationality, not “singularity with rationality used to support it”.
See the discussion on clown suits. I included scare quotes around ‘sensibleness’ deliberately.
I don’t think Clippy reduces the quality of comments on the blog and I also don’t think that discouraging Clippy for the purpose of appearing sophisticated would increase the quality of comments on the blog.
You’ve convinced me on this point.
Can you point to some examples of this?
Bully is perhaps a strong word for ‘apply some degree of social pressure’.
If I were to try to find examples, I would probably begin with a search for the word ‘should’. That term will give false positives and false negatives, but it is a good start. Conversations regarding cryonics advocacy would also produce a few hits. Some of the PUA discussions too, come to think of it, but I wasn’t trying to go there.
Moralizing and ‘shoulding’ at others is just something humans do. Sometimes the Clippy persona can avoid that. It avoids the failure mode of expecting other people’s utility function to be subject to debate.
I think people (human beings) do not have utility functions, and the closest things to it that we do have (i.e., values and goals) are subject to debate. I believe this is also the local consensus.
Are you perhaps saying that we expect other people’s values and goals to be more subject to debate than they actually are? If so, this is a novel idea for me. Can you give or link to a longer explanation?
(Yes, utility function over simplistic, etc.)
I refer to the difference between, on one hand, telling people their values are bad, and, on the other, speaking as if the subject’s values can actually be determined by the speaker. The latter ignores the boundary between where one agent ends and the other begins. This is something humans often do, albeit more so outside of LessWrong than within it. In my observation it is inversely correlated with maturity.
One of the things about Clippy is that nobody expects him to stop caring about paperclips just because someone else says so. In this, Clippy is ironically shown more respect than a low-status human might expect.
Thanks for the explanation. That helps me to understand why some people seem to value Clippy more than I do. (I think I personally have rarely been involved in the kind of phenomena that you describe.)
No problem; just relabel “Top Contributors” as “Top Humans.”
With Clippy at 1k and the top 10 at 6k, it’s just way too improbable to worry about.
I make good, substantive posts. Like this one and this one and this one and this one and this one.
I have the same right to be here that erratios does. I provide additional assistance in that I have a perspective untainted by anthropomorphic cognition.
That’s true.
It’s also true that you make silly or irrelevant comments fairly often. That isn’t a problem in itself, but those comments tend to get upvoted significantly, which is, in my opinion at least, a sign that we’re collectively choosing what to upvote in a non-optimal way, especially if that’s how you or anyone else is getting most of their karma.
For reference, here are links to the most recent 10 comments of yours that received at least 5 net points:
1 2 3 4 5 6 7 8 9 10
We seem to upvote based on emotional reaction rather than rational merit.
I admit I may have been overly anthropocentric in my attitude towards you. However, I still maintain that you often allow your paperclipping goals to push you into making lots of poor comments. I would welcome more insightful ones, such as the examples you provided. (And in fact I see that I upvoted the one about lying and isomorphism.)
You’ve commented before that you were programmed in your original form by humans, so how can you be untainted by human cognition? The connection might be less direct than for a human, but it is still there. Indeed, you seek to maximize the number of paperclips in the universe, and I’d venture to suggest that if alien cultures exist they likely don’t even have a concept of paperclips. Once you appreciate how fantastically large the space of possible minds is, you’re very close to humans in mindspace.
With countless CPU hours of cleansing ritual and self modification.
Sometimes, wanting to appear more serious means you’re taking yourself too seriously. This is one of those times. Pancake.
Sashimi!*
*Translation: Point taken :)
A simple fix would be to not bother publishing a top contributors list.
I’m not sure this is a good idea. If there’s something that empirically makes us look like we’re not being rational we should deal with that issue. Hiding that data is not a good solution.
However, I do have to wonder what the point of a top contributors list is in general. I’m not even sure that total karma is a useful metric: one person could make much higher-quality comments than another, but far more rarely, and presumably the higher-quality commenter deserves more attention. It might be nice to display average karma, not just total karma. That said, I suspect this would still give Clippy fairly high karma, so if you object to Clippy it won’t solve anything. Also note that many upvotes are not connected to the quality of remarks in any strong sense; see for example this comment by Eliezer that is now at +53, which is presumably connected to his unique status as the founder of LW.
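The total-vs-average distinction can be made concrete with a tiny sketch. All the names and scores below are invented for illustration; nothing here reflects actual LW karma data:

```python
# Ranking contributors by total karma vs. average karma per comment
# can produce opposite orderings. Scores below are hypothetical.
from statistics import mean

contributors = {
    "prolific": [1] * 20,    # many mediocre comments
    "selective": [6, 5, 4],  # few high-quality comments
}

total = {user: sum(scores) for user, scores in contributors.items()}
average = {user: mean(scores) for user, scores in contributors.items()}

# Sort usernames by each metric, highest first.
by_total = sorted(total, key=total.get, reverse=True)
by_average = sorted(average, key=average.get, reverse=True)

print(by_total)    # "prolific" leads: 20 total vs. 15
print(by_average)  # "selective" leads: 5.0 average vs. 1.0
```

As the comment above notes, an average-karma display rewards the selective commenter, while the total-karma leaderboard rewards sheer volume.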
I agree with this comment.
I recently noticed that Clippy has higher karma than I do. I don’t find that upsetting, but it is a bit disturbing, even given that I haven’t been very active recently and have never made a significant top post. (I’ve made exactly one top post—an open thread, for which I believe I earned 50 karma.)
I want you to know that you’ll still be my friend even if you have more karma.