I think the truthseeking norms on LW are especially useful for collective sense-making and sharing ideas about things where there isn’t already a very confident consensus. As an example, during the COVID pandemic, many of the other groups of intellectuals who were trying to figure things out had various limitations:
Some had conflicts of interest with regard to their own work.
Some were published (self- or otherwise) by businesses more interested in engaging than informing.
Some considered themselves responsible for manipulating people’s behavior, rather than purely informing them.
Some were in sociopolitical situations that constrained what views were acceptable for them to hold.
That makes it difficult to use their output, at least if you are a layperson. Those constraints are less common in well-argued LW discussions about most topics.
Also, we understand basic arithmetic around here, which goes a long way sometimes.
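To make “basic arithmetic” concrete, here is a minimal sketch of the kind of back-of-envelope COVID calculation I have in mind; the growth rate and case count are made-up illustration numbers, not real data:

```python
import math

# Illustration with made-up numbers: turning an assumed daily case
# growth rate into a doubling time and a naive month-ahead projection.
daily_growth = 0.25  # assumed 25% day-over-day growth in cases
cases_now = 1_000    # assumed current daily case count

doubling_time = math.log(2) / math.log(1 + daily_growth)
cases_in_30_days = cases_now * (1 + daily_growth) ** 30

print(f"doubling time: {doubling_time:.1f} days")    # ~3.1 days
print(f"cases in 30 days: {cases_in_30_days:,.0f}")  # ~808,000
```

Nothing deep, but noticing that 25% daily growth compounds to an ~800-fold increase in a month is exactly the step a lot of commentary skipped.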
Yeah, “collective sense-making” feels right to me. Individual aspiring rationalists sometimes say crazy things, but the rest of the group usually corrects them when they do.
As opposed to what I think are typical situations outside of Less Wrong, where:
low-status people do not dare to say unexpected things
when high-status people say something, no one dares to contradict them
So either the truths do not appear, or the falsehoods do not disappear. Basically, a group of normies is usually about as smart/sane as its highest-status member. Aspiring rationalists do better (although not perfectly) at merging individual knowledge into a smarter whole.
Thus you can have rare specialists talk about e.g. COVID or crypto and have an impact on the community at large, as the community collectively evaluates their claims as probably correct. But this is not the same as “open-mindedness” as usually understood, because the community can also collectively reject various things; otherwise we would see all kinds of scams and hype here.
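As a toy model of what “merging individual knowledge” can buy you (the probabilities here are invented for illustration, not taken from anywhere): if members hold independent evidence about the same claim on top of a shared 50% prior, naively pooling their log-odds gives a group estimate more confident than any individual’s:

```python
import math

def prob_to_logodds(p: float) -> float:
    return math.log(p / (1 - p))

def logodds_to_prob(x: float) -> float:
    return 1 / (1 + math.exp(-x))

# Invented individual estimates that some claim is true, each based
# on independent evidence on top of a shared 50% prior (log-odds 0).
estimates = [0.70, 0.65, 0.80]

# Naive Bayes pooling: independent evidence adds up in log-odds space.
pooled = logodds_to_prob(sum(prob_to_logodds(p) for p in estimates))

print(f"best individual: {max(estimates):.2f}")  # 0.80
print(f"pooled estimate: {pooled:.2f}")          # ~0.95
```

The independence assumption does all the work here; a status hierarchy where everyone defers to the top person destroys it, which is exactly the failure mode described above.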
(In this context, it is worth paying special attention to various rationalist cults, but it seems to me that they all happened on the periphery of the community, in places isolated from the collective feedback. Again, some of us are individually quite insane, but we are collectively sane. Insanity prevails when a charismatic person succeeds at creating an isolated bubble of wannabe rationalists.)