I’m afraid “Good at presenting their ideas in a persuasive manner” is doing all the heavy lifting here.
If the community had a good impression of him, they’d value his research over that of a PhD. If the community had a bad impression of him, they’d not give a second of thought towards his “research” and they would refer to it with the same mocking quotation marks that I just used. However, in the latter case, they’d find it more difficult to dismiss his PhD.
In other words, the interpretation depends on whether the community likes you or not. I’ve been in other rationalist communities and I’m speaking from experience (if I were less vague than this, I’d be recognizable, which I don’t want to be). I saw all the negative social dynamics that you’d find on Reddit or in young female friend groups with a lot of “drama” going on, in case you’re unfortunate enough to have an intuition for such a thing.
In any “normie” community there’s the staff in charge, and a large number of regular users who are somewhat above the law and who feel superior to new users (and can bully them all they want, as they’re friends with the staff). The treatment of users depends on how well they fit in culturally, and it requires that they act as if the regulars are special (otherwise their egos are hurt). Of course, some of these effects are borderline invisible on this website, so they’re either well-hidden or kept in check.
Still, this is not a truth-maximizing website; the social dynamics and their false premises (e.g. the belief that popularity is a measure of quality) are just too strong. The sort of intellectuals who don’t care about social norms, status, or money are better at truth-seeking and are generally received poorly by places like this.
I mean, I agree with this, but popularity has a better correlation with truth here than on any other website—or more broadly, social group—that I know of. And actually, I think it’s probably not possible for a relatively open venue like this to be perfectly truth-seeking. To go further in that direction, I think you ultimately need some sort of institutional design to explicitly reward accuracy, like prediction markets. But the ways in which LW differs from pure truth-and-importance-seeking don’t strike me as entirely bad things either—posts which are inspiring or funny get upvoted more, for instance. I think it would be difficult to nucleate a community focused on truth-seeking without “emotional energy” of this sort.
I don’t think it’s possible without changing the people into weird types who really don’t care too much about the social aspects of life because they’re so interested in the topics at hand. You can try rewarding truth, but people still stumble into issues regarding morality, the popularity of ideas, the Overton window, some political group they dislike randomly hitting upon the truth so that they look like supporters for stating the same thing, etc.
I think prediction markets are an interesting concept, but they cannot be taken much further than they are now, since the predictions could start influencing the outcomes. It’s dangerous to attach rewards to the outcomes of predictions, for when enough money is involved, one can influence the outcome.
The way humans in general differ from truth-seeking agents makes their performance downright horrible in some specific areas (if the truth is not in the Overton window, for instance). These inaccuracies can cascade and cause problems elsewhere, since they produce incorrect worldviews even in somewhat intelligent people like Musk. There’s also a lot of information which is simply getting deleted from the internet, and you can’t “weigh both sides of the argument” if half the argument is only visible on the Wayback Machine or archive.md.
I guess it’s important to create a good atmosphere where everyone is having fun theorizing and such, but some of the topics we’re discussing are actually serious. The well-being of millions of people depends on the sort of answers and perspectives which float around public discourse, and I find it pathetic that ideas are immediately shut down if they’re not worded correctly or if they touch a growing list of socially forbidden hypotheses.
Finally, these alternative rewards have completely destroyed almost all voting systems on the internet. There’s almost no website left on which the karma/thumb/upvote/like count bears any resemblance to post quality anymore. Instead, it’s a linear combination of superstimuli like ‘relatability’, ‘novelty’, ‘feeling of importance (e.g. bad news, danger)’, ‘cuteness’, ‘escapism’, ‘sexual fantasy’, ‘romantic fantasy’, ‘boo outgroup’, ‘irony/parody/parody of parody/self-parody/nihilism’, ‘nostalgia’, and ‘stupidity’ (I’m told it’s a kind of humor if you’re stupid on purpose, but I think “irony” is a defence mechanism against social judgement). It’s like a view into the unfulfilled needs of the population. YouTube view counts and subscriptions, Reddit karma, Twitter retweets: all gamed almost to the point that they’re useless metrics. Online review sites are going in the same direction. It’s like interacting with a group of mentally ill people who decide what you’re paid each day. I think it’s dangerous to upvote comments based on vibes, as it takes very little to corrupt these metrics, and it’s hard to notice if upvotes gradually come to represent “dopamine released by reading” or something other than quality/truthfulness.