The Bayesian line is that we are too certain of our own flawed reasoning faculties, and should defer more to the average opinion. After all, if everyone has similar but noisy reasoning abilities, then lots of people coming to a conclusion is a much better indicator than a single person coming to another conclusion – and there is no reason to privilege the latter just because that brain happens to be your own.
On the margin, this is probably true—most people err on the overconfident side. But you have to adjust for the fact that on many issues people don’t actually apply their full attention and come up with independent conclusions. In some areas they do: markets encourage rationality by punishing those who over- or under-estimate the value of a product, and in many areas – “how do you make friends?” or “how do you become a great boxer?” – most people with an answer have had to do their own research, reflect on their experience, and come up with independent answers. But as Razib showed, on many issues, including the most contentious ones, the typical person with an opinion hasn’t actually done much research or critical thinking to justify it; most people simply subscribe to the opinions held by people they respect.
Doing serious, first-principles research is hard and time-consuming. Gary Taubes’ investigation of diet science required a decade of effort and a mastery of biochemistry and physiology. Yes, he was fortunate to turn up enough interesting material to make a book out of it, but for most of us it’s neither realistic nor worthwhile to invest the effort required to audit each of our beliefs, especially the mostly symbolic ones we use to demonstrate tribal affiliation.
Which is fine, but in such areas we need to attach less weight to the consensus view, since it represents fewer units of independent cognition. For the rookie boxer, deferring to the consensus is probably the best choice. But for many issues, when you read “everyone believes X” it’s reasonable to substitute “a small, highly incestuous group of interested parties and political spinners believes X, and everyone else follows along.” In such fields, the truth-value of consensus is relatively low.
It’s not just that they haven’t given the question their full attention, but that part of their attention is devoted to keeping their views correlated with the opinions of others.
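The effect of that correlation can be made concrete with a toy simulation (the voter counts, accuracy figure, and copying probability below are illustrative assumptions of mine, not anything from the original argument):

```python
import random

random.seed(0)

def majority_correct(n_voters, p_correct, p_copy, trials=10000):
    """Fraction of trials in which the majority lands on the truth.

    Each trial: five 'independent thinkers' reason on their own and are
    correct with probability p_correct. Each remaining voter either
    copies a random independent thinker (with probability p_copy) or
    reasons independently too.
    """
    wins = 0
    for _ in range(trials):
        independents = [random.random() < p_correct for _ in range(5)]
        votes = list(independents)
        for _ in range(n_voters - 5):
            if random.random() < p_copy:
                votes.append(random.choice(independents))  # defer to a respected few
            else:
                votes.append(random.random() < p_correct)  # think it through alone
        if sum(votes) * 2 > len(votes):  # majority voted for the truth
            wins += 1
    return wins / trials

# A fully independent crowd vs. a mostly-copying crowd of the same size
print("independent:", majority_correct(101, 0.6, p_copy=0.0))
print("copying:    ", majority_correct(101, 0.6, p_copy=0.95))
```

With independent voters, a crowd of barely-better-than-chance reasoners is almost always right (the Condorcet jury theorem at work); when most voters copy a handful of opinion leaders, the crowd's accuracy collapses to roughly that of the small group: 101 votes, but only about five units of independent cognition.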
Also, note that “reasoning from first principles” is decidedly not equivalent to “reasoning”. The blog post’s casual confounding of the two just annoyed the crap out of me.