Most important matters have a large political component. If it’s not political, it’s probably either not important or highly neglected (and as soon as it’s not neglected, it probably gets politicized). Moreover, a document I would classify as reliable in a non-political context becomes harder to evaluate if the same people produced it in a politicized context. For instance, consider this presentation by a virologist. Ordinarily I would consider a video quite reliable if it’s an expert making a seemingly strong case to other experts, but it was produced in a politicized environment, and that makes it harder to be sure I can trust it. Maybe, say, the presenter is annoyed about non-experts flooding in to criticize him or his field, so he’s feeling more defensive and wants to prove them wrong. (On the other hand, increased scrutiny can improve the quality of scientific work. It’s hard to be sure. Also, the video had about 250 views when I saw it and 576 views a year later; it was made for and directed at an expert audience and never came anywhere close to viral, so he may be less guarded in this context than when he is talking to a journalist or something.)
My goal here is not to solve the problem of “making science work better” or “keeping trivia databases honest”. I want to make the truth easier to find in a political environment that has immense groups of people who are arriving at false or true beliefs via questionable reasoning and cherry-picked evidence, and where expertise is censored by glut. This tends to be the kind of environment where the importance and difficulty (for non-experts) of getting the right answer both go up at once. Where once a Google search would have taken you to some obscure blogs and papers by experts discussing the evidence evenhandedly (albeit in frustratingly obscurantist language), politicization causes the same search to give you page after page of mainstream media and bland explanations which gravitate to some narrative or other and which rarely provide strong clues of reliability.
I would describe my personal truthseeking as frustrating. It’s hard to tell what’s true on a variety of important matters, and even the ones that seem easy often aren’t so easy once you dive into them. Examples:
I mentioned before my frustration trying to learn about radiation risks.
I’ve followed the Ukraine invasion closely since it started. It’s been extremely hard to find good information, to the point where I use quantity as a substitute for quality because I don’t know a better way. This is wastefully time-consuming, and if I ever manage to reach a firm conclusion about a subtopic of the war, I have nowhere to publish my findings that any significant number of people would read. (I often publish very short summaries or links to what I think is good information on Twitter, knowing that publishing in more detail would be pointless given my lack of audience; I also sometimes comment on Metaculus about war-related topics, but only when my judgement pertains specifically to a forecast that Metaculus happens to ask about.) The general problem I have in this area is a combination of (1) almost nobody citing their sources, (2) the sources themselves often being remarkably barren, e.g. the world-famous Oryx loss data [1, 2] gives nowhere near enough information to tell whether an asserted Russian loss is actually Russian rather than Ukrainian, (3) Russia and Ukraine both running strong information operations that create constant noise, and (4) pro-Putin sources being annoying to me because of their bloodthirstiness, ultranationalism and authoritarianism, so that even when some of them give good evidence, I am less likely to discover them, follow them and see that evidence.
It appears there’s a “97% consensus on global warming”, but when you delve deep into it, it’s not as clear-cut. Sorry to toot my own horn, but I haven’t seen any analysis of the consensus numbers as detailed and evenhanded as the one I wrote at that link (though I have a bias toward the consensus position). That’s probably not because no one else has done such an analysis, but because an analysis like that (written by a rando and not quite affirming either of the popular narratives) tends not to surface in Google searches. Plus, my analysis is not updated as new evidence comes in, because I’m no longer following the topic.
I saw a rather persuasive full-length YouTube ‘documentary’ promoting Holocaust skepticism. I looked for counterarguments, but they were relatively hard to find among the many pages saying something like “they only believe that because they are hateful and antisemitic” (the video didn’t display any hint of hate or antisemitism that I could see). When I did find the counterarguments, they were interlaced with strong ad hominem attacks against the people making the arguments, which struck me as unnecessarily inflammatory rather than persuasive.
I was LDS for 27 years before discovering that my religion was false, despite always being open to that possibility. For starters, I didn’t realize the extent to which I lived in a bubble or to which I and (especially) other members had poor epistemology. But even outside the bubble it just wasn’t very likely that I would stumble upon someone who would point me to the evidence that it was false.
Is it only the other people who are not good at collecting and organizing evidence?
No, I don’t think I’m especially good at it, and I often wonder if certain other smart people have a better system. I wish I had better tooling and I want this tool for myself as much as anyone else.
Not a good sign
In what way? Are you suggesting that if I built this website, it would not in fact use algorithms designed in good faith around epistemological principles meant to elevate ideas that are more likely to be true but, rather, it would look for terms like “global warming” and somehow tip the scales toward “humans cause it”? Please be specific.
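To make that contrast concrete, here is a purely hypothetical sketch; every function name, feature, and weight below is invented for illustration, not a description of any real system. A good-faith ranking rule scores properties that correlate with reliability regardless of which side of a debate a claim is on, while the bad-faith version adds a hidden bonus when a claim matches a favored narrative:

```python
# Hypothetical illustration only: all features and weights are made up.

def epistemic_score(claim):
    """Topic-neutral scoring: reward properties that tend to correlate
    with reliability, no matter which side of a debate the claim is on."""
    score = 0.0
    score += 2.0 * len(claim["cited_sources"])            # cites its evidence
    score += 1.5 if claim["addresses_counterarguments"] else 0.0
    score -= 1.0 * claim["ad_hominem_count"]              # penalize attacks on people
    return score

def scale_tipping_score(claim):
    """The bad-faith version: same surface as above, but with a hidden
    bonus for claims that affirm a favored narrative on a keyword match."""
    score = epistemic_score(claim)
    if "global warming" in claim["text"].lower() and claim["affirms_consensus"]:
        score += 10.0                                      # thumb on the scale
    return score

claim = {
    "text": "Global warming attribution studies disagree on magnitude.",
    "cited_sources": ["study A", "study B"],
    "addresses_counterarguments": True,
    "ad_hominem_count": 0,
    "affirms_consensus": False,
}
print(epistemic_score(claim))      # 5.5
print(scale_tipping_score(claim))  # 5.5 (the hidden bonus doesn't trigger here)
```

The point of the sketch is that the two functions are indistinguishable on most inputs; the thumb on the scale only shows up on narrative-relevant claims, which is exactly why such a bias is hard to detect from the outside.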
A lot of the resources invested in “fighting misinformation” are devoted to censoring non-establishment voices, and that often includes putting out misinformation of its own, like “Hunter’s laptop was a Russian misinformation campaign”, to facilitate political censorship.
In that environment, someone proposing a new truthseeking project might really be interested in creating a project to strengthen the ruling narrative, or they might be interested in actual truthseeking that affirms the ruling narrative when it’s right and challenges it when it’s wrong.
In a world with so much political pressure, it probably takes strong conviction to run a project that does actual truthseeking instead of letting it be co-opted for narrative control.