I’ve gotten the impression that many people are proud of their ignorance because they feel that the folks who know a field better can’t see the forest for the trees and only use their knowledge to construct elaborate rationalizations. It’s somewhat similar to the way that an atheist might say “no, I don’t know anything about theology’s answer to the Problem of Evil and I don’t care, they’re wrong regardless”.
(I must admit that I’ve sometimes thought like this myself, about some topics.)
It’s somewhat similar to the way that an atheist might say “no, I don’t know anything about theology’s answer to the Problem of Evil and I don’t care, they’re wrong regardless”.
That’s roughly my position vs. the Problem of Evil; I don’t care what the theologians say; but then, I don’t use the Problem of Evil to argue against religion, religions would be wrong even if they had the perfect answer to that specific argument (same goes for other weak arguments like the historicity of Jesus).
You realize that such epistemic habits would never lead you to truth if you were indoctrinated with a different prior? “I don’t care what the atheists say about the Problem of Evil, they would be wrong for a host of other reasons even if the theologians’ arguments suck.” In general your epistemic habits should be meta enough that you wouldn’t immediately be screwed if your prior were different. [ETA: Deleted needlessly snarky sentences. Apologies.] [ETA2: This is certainly an uncharitable caricature of your actual position; I wrote it the way I did to shine light on a basic point of epistemology. In general this comment kinda sucks and shouldn’t be taken too seriously.]
You realize that such epistemic habits would never lead you to truth if you were indoctrinated with a different prior? “I don’t care what the atheists say about the Problem of Evil, they would be wrong for a host of other reasons even if the theologians’ arguments suck.”
Then in that case it’s not just a different prior I’d need, it’s the “host of other reasons”, and more specifically, strong reasons among those.
I agree that there’s an important distinction between “argument A can be ignored because there are much stronger arguments for the same position” (roughly my position) and “argument A can be ignored because there are many more arguments for the same position” (“arguments are soldiers”; this leads to the problem you describe).
My purpose is mostly to avoid long debates that won’t change either side’s mind anyway, even if one side convinces the other on that particular issue (like “does Barack Obama pick his nose?”).
I’ve gotten the impression that many people are proud of their ignorance because they feel that the folks who know a field better can’t see the forest for the trees and only use their knowledge to construct elaborate rationalizations.
Well said—that’s about where I am with respect to mainstream macroeconomics, in that the models, metrics, etc. they use are hopelessly disconnected from the issue of “How would this policy allow people to get more of what they actually want?” But I’m not sure if I count as being proud of this ignorance: I’d certainly like to know more, so I can get the best understanding of where (I currently suspect) they go astray.
It’s equivalent to the hypothesis that seeing more evidence is often bad, which is a theme emphasized on LW and in some Abrahamic sects (but only the LW/religious devout take it particularly seriously). Does it show up among normal people as well? In my experience normal people tend to assume that if someone’s studied something for a while then that person’s opinions are probably worth taking seriously.