Hm. Let me write a short “Defense of Mount Stupid” (I love the term, by the way; great fan of SMBC, too). Or, to be more accurate, I’m going to muse around the topic, in hope this shakes out some other ideas and thoughts on the subject.
I spent quite a bit of my late teens/early twenties yelling at people from the top of Mount Stupid. As Nornagest describes, internet debates changed everything: once the words were permanently written, I couldn’t bullshit as effectively, nor could I easily change the subject if some bit of bullshit got caught. I found my baseless assertions and half-assed pontifications challenged by people who had better knowledge of the subject and a better understanding of the data. It took about two or three spectacular shootdowns, but I eventually learned my lesson. A permanent record is a good thing, and these days I generally refuse to enter into content-intensive verbal debates whenever possible. Let’s sit down, sonny, and write our arguments down.
However, I eventually also found my fear of Mt. Stupid quite limiting. One can’t become an expert in every area, or sift through the data in every field that is interesting or relevant to one’s life. There isn’t a real option of ignoring those areas either.
Take, for instance, economics. My knowledge is very limited: a few books, a class or two, and reading some blogs from time to time. If I’m honest about it, I have to admit that I will never become an expert in it either: it is highly improbable I will ever have time or motivation to study the area in great depth. And yet I do have an opinion on economics. The current economic policies will have a profound effect on my future life; I have to try and guess what the effects of those policies are, at least in terms of trying to avoid catastrophic outcomes. And I have to choose political candidates, who are pushing economic reforms based (as far as I can see) on understanding of economics even shallower than mine.
Now, it is possible, even likely, that I’m just indulging an illusion. Even professional economists can’t really predict the future effects of current policies in much detail; I probably shouldn’t even be trying. But then my choices for setting up my economic future become almost a coin toss, or have to be based on blind faith in whatever economic adviser I end up with (and how do I choose one in the first place? another coin toss?). I am more or less forced to believe that my limited knowledge has some value, even if that value is very low.
Fine, let’s say this is OK: have an opinion, if you must. The point is, don’t opine. If you really understand how limited your knowledge is, don’t go out there talking about what you think.
What happens if I just stay quiet? In that case, I will be the only one with access to my own opinion, vastly increasing the power of my confirmation bias. Sure, I can try to intentionally read the opinions of people I disagree with, but that is unlikely to happen: I will rarely be able to summon the motivation to read things I believe to be simply wrong. If I do manage to make myself read, I will likely skim and skip, and end up reinforcing my opinion by missing the entire point. Even worse, I don’t know what I’m looking for. If there is some bit of economic information that is very important for some aspect of my economic thinking, I probably won’t ever find it, or won’t realize what it is if I stumble upon it. My understanding of the structure of economics just isn’t good enough.
If I do opine, if I risk climbing Mt. Stupid, my opinions will be challenged. This will motivate me to do research and to check both my and my opponent’s data, forcing me to keep learning and to keep re-evaluating my position. Others, whether they agree or disagree with me, will come up with things I have never thought of. The opponents won’t be writing general opinions (which I can think around, or dismiss as irrelevant due to some particular point or technicality not mentioned by their book or article), they will be trying to undermine my specific position with precisely aimed arguments. If I escape an argument by invoking some detail, they can adapt and expand their argument to block off the escape route. Debate can thus be a cognitive tool that motivates learning and promotes growth.
If the above reasoning holds, it follows that Mt. Stupid should not always be avoided. If I remember the limits of my knowledge, if I try to resist my biases as much as that is possible, it can provide a way for growth in areas that would otherwise remain static.
My usual experience is that when I express my current beliefs as my current beliefs, with some nod to why I hold them, and if I actually pay attention to counterarguments and update my beliefs when those counterarguments are convincing, the end result is that I don’t end up climbing Mount Stupid… I’m simply wrong, which happens all the time.
An entirely valid point, in the narrow definition of Mount Stupid. I used it more broadly to mean “I’m not only holding an opinion about a topic in which I’m not an expert, I’m also in a situation where I have to express that opinion in public.” In this case, Mt. Stupid covers your approach (which is the reasonable approach I also use, and which is the closest we can get to rationality). The point of the above was to provoke other people’s thoughts on the general approach of what to do when you must have a non-expert opinion.
I very broadly agree with you. And I think it would be helpful to discover how one finds oneself on Mount Stupid and how to properly descend. This is all only from my own personal experience and so I well may be on Mount Stupid opining about Mount Stupid. But from whatever hill, mountain and peak I rest on, these have been my observations so far:
Some popularizer will write a book, essay, or regular column and explain in broad strokes a mixture of his opinions, the hard facts, and the scholarly consensus on what those facts mean. To a layman, that mixture of tonic, toxic, and placebo is impossible to sort out. Further, since the popularizer cannot go too many inferential steps ahead of the reader without needing to split a book into multiple volumes, explainers commonly resort to lies-to-children. Instead of gaining a healthy respect for the known unknowns and the difficulty of a given subject, readers feel free to shout from Mount Stupid’s peak. After all, it is not the popularizers who suffer any unfortunate avalanches as their readers yell, at the top of their lungs, half-remembered quotes from a journalist’s attempt to explain experimental physics. Those on Mount Stupid did not climb to get there. Somebody built a ski slope of lies-to-children, narratives, and metaphors that led them up and up until they acquired a degree of unwarranted confidence and a reverence for the bearer of good info. The difficulty of Mount Stupid rests in the fact that you were taken there without the proper tools to get down, and where you could go to get those tools is usually unclear, or at least costly in terms of time and effort.
How accurate does this judgment seem to your own knowledge of Mount Stupid, and further, what tools other than having your gross ignorance exposed have led you downhill toward expertise and humility?
I mostly agree with your analysis above; I would add, however, one very internal factor. People who do not possess significant expertise in a complex area almost always underestimate the complexity of complex systems in general. Even if they read about complexity, they rarely get an intuitive feel for it. So, reading a few popular books doesn’t just introduce the problems you have stated above. Since there is no intuitive understanding of just how complicated things are, a person feels that the little information they have gleaned is sufficient to form an informed opinion on a topic. IMHO, this general problem also stands behind the popularity of many simplistic (and ultimately destructive) ideologies based on simplistic approaches to complex systems (such as, say, communism or libertarianism).
Along those lines, a thing that helped me a lot in this regard was becoming an expert in a complex field. Seeing how very intelligent people form deeply wrong opinions about things I understand made me very, very aware of similar biases in my thinking about other fields. It didn’t cure me of forming such opinions, but it does force me to reexamine them aggressively as soon as I realize they exist in my mind.