If I know that a false argument sounds just as convincing as a true argument, argument convincingness provides no evidence either way, and I should ignore it and stick with my prior.
That’s true, but it’s just a restatement of your ignorance of the topic. When one is sufficiently ignorant of a topic, one isn’t capable of evaluating the arguments.
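In odds form (writing “convincing” for the observation that the argument sounds convincing, and assuming that observation is the only evidence on the table):

$$\frac{P(\mathrm{true} \mid \mathrm{convincing})}{P(\mathrm{false} \mid \mathrm{convincing})} = \frac{P(\mathrm{convincing} \mid \mathrm{true})}{P(\mathrm{convincing} \mid \mathrm{false})} \cdot \frac{P(\mathrm{true})}{P(\mathrm{false})}$$

If false arguments sound convincing exactly as often as true ones, the likelihood ratio on the right is 1, the posterior odds equal the prior odds, and convincingness carries no information, which is just what the quoted claim says.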
But Yvain suggests that continued education left him unable to differentiate the quality of arguments. How much of that was because he was reading only nonsense? Reading competing Timecube-quality arguments on a particular topic doesn’t add to one’s understanding, but so what? That doesn’t imply that the ability to recognize good arguments is somehow out of reach; one can still aspire to become better at it, and reasonably expect to achieve that goal.
In short, unwillingness to take ideas seriously sounds like a terrible idea. Unwillingness to take bad ideas seriously is worthwhile, but skipping over the mechanisms for filtering good ideas from bad leaves me confused about the point of the post.
And the point of CFAR is to help people become better at filtering good ideas from bad. It is plainly not to produce people who automatically believe the best verbal argument anyone presents to them, without regard for what filters that argument has been through, or for what incentives the Skilled Arguer might have to utter the Very Convincing Argument for X instead of the Very Very Convincing Argument for Y. And it is certainly not to have people ignore their instincts; e.g. CFAR constantly recommends Thinking, Fast and Slow by Kahneman, and teaches exercises for extracting more information from emotional and physical senses.
The point of the post is that most people, in most domains, should not trust that they are good at filtering good ideas from bad.
Or good courses from bad ones. People should rely more on empirical evidence; that is to say, they need more empiricism and less rationalism. E.g. here, in the rationalist community (quoting verbatim from the article linked on the About page): “Epistemic rationality is about forming true beliefs, about getting the map in your head to accurately reflect the territory of the world. We can measure epistemic rationality by comparing the rules of logic and probability theory to the way that a person actually updates their beliefs.” Just about anyone else would measure that kind of thing by predicting something hidden and then checking for correctness, which is more empiricist than rationalist.
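For concreteness, here is what that empiricist measurement might look like, as a minimal sketch: the forecasts and outcomes below are invented, and the Brier score is just one standard rule for scoring probabilistic predictions against what actually happened.

```python
# A minimal sketch of "predict something hidden, then check for correctness".
# The numbers are invented for illustration; any proper scoring rule would do.

def brier_score(forecasts, outcomes):
    """Mean squared error between stated probabilities and actual outcomes.
    Lower is better; always answering 50% scores exactly 0.25."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Probabilities a hypothetical forecaster assigned before the answers were known.
forecasts = [0.9, 0.7, 0.2, 0.6, 0.95]
# What actually happened: 1 if the event occurred, 0 if it did not.
outcomes = [1, 1, 0, 0, 1]

print(f"Brier score: {brier_score(forecasts, outcomes):.3f}")
```

Doing reliably better than the constant-50% score of 0.25, on questions whose answers were hidden at prediction time, is the kind of track record the rule-comparison definition never asks for.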