Making a comment on the proposed solutions to the epistemic problems. I agree with these solutions:
Partial solution from academia: there are norms restricting people’s (influential) opinions to their domain of expertise. This creates a filter where the opinions you care about are much more likely to be the result of deep engagement with details on a given topic, and so are more likely to be correct. (Relatedly, my biggest critique of individual LW epistemics is a lack of respect for how much details matter.)
For curation in particular: get some “optimists” to feed into curation decisions. (Buck, Ryan, and Lukas all seem like potential candidates, seeing as they aren’t as pessimistic as me and at least Buck + Ryan already put some effort into LW group epistemics.)
But massively disagree with this solution:
Partial solution from academia: procedural norms around what evidence you have to show for something to become “accepted knowledge” (typically enforced via peer review).[5]
My general issue here is that peer review doesn't work nearly as well as people think for catching problems. In particular, I think science is advanced much more by the best theories rising to prominence than by the worst theories being suppressed, and the problem of bad theories taking up too much space is better addressed at the funding level than at the theory level.