Sometimes people propose ideas that are immediately met with harsh criticism. A very common human tendency is to defend our ideas and work against these criticisms, which often puts us into a state people refer to as “defensive.”
According to common wisdom, being in a defensive state is a bad thing. The rationale is that we shouldn’t get too attached to our own ideas. If we do get attached, we risk becoming crackpots who can’t give an idea up because doing so would make us look bad. Therefore, the common wisdom advocates treating ideas as if they were handed to us on a tablet from the clouds rather than as products of our brain’s thinking habits. Taking this advice allows us to detach ourselves from our ideas so that we don’t confuse criticism with insults.
However, I think the exact opposite failure mode is pointed out and guarded against far less often. Specifically, the failure mode is being too willing to abandon beliefs on the strength of surface-level counterarguments. To counteract this, I suggest we shouldn’t be so ready to give up our ideas in the face of criticism.
This might sound irrational—why should we get attached to our beliefs? I’m certainly not advocating that we actually treat criticism as an insult to our character or intelligence. Instead, my argument is that the process of defending against criticism generates a productive adversarial structure.
Consider two people. Person A desperately wants to believe proposition X, and person B desperately wants to believe not-X. If B comes up to A and says, “Your belief in X is unfounded. Here are the reasons...”, then A can either admit defeat or fall into defensive mode. If A admits defeat, they might indeed get closer to the truth. On the other hand, if A goes into defensive mode, they might also get closer to the truth in the process of desperately searching for evidence of X.
My thesis is this: the human brain is very good at selectively searching for evidence. In particular, given some belief that we want to hold onto, we will go to great lengths to justify it, searching for evidence we would never have looked for if we were detached from the debate. It’s sort of like the difference between a debate between two people assigned their roles by a coin toss, and a debate between two people who have spent their entire lives justifying why they are on one side. The first debate is an interesting spectacle, but I expect the second to contain much deeper theoretical insight.
Just like an idea can be wrong, so can criticism. It is a mistake to give up an idea just because...

- someone rounded it off to the nearest cliché and provided the standard cached answer;
- someone cited a scientific article (one that failed to replicate) that disproves your idea (or something different that merely contains the same keywords);
- someone got angry because the idea seems to oppose their political beliefs;
- etc.
My “favorite” version of wrong criticism is when someone experimentally disproves a strawman version of your hypothesis. Suppose your hypothesis is “eating vegetables is good for health”, and someone runs an experiment where people are only allowed to eat carrots, and nothing else. After a few months the subjects get sick, and the author of the experiment publishes a study saying “science proves that vegetables are actually harmful for your health”. (Suppose, optimistically, that the author used a sufficiently large N and did the statistics properly, so there is nothing to attack from the methodological angle.) From now on, whenever you mention that perhaps a diet containing more vegetables could benefit someone, someone will send you a link to the article that “debunks the myth” and will consider the debate closed.
So, when I hear about research proving that parenting / education / exercise / whatever doesn’t cause this or that, my first reaction is to wonder how specifically the researchers operationalized such a general word, and whether the thing they studied even resembles my case.
(And yes, I am aware that the same strategy could be used to refute any inconvenient statement, such as “astrology doesn’t work”—“well, I do astrology a bit differently than the people studied in that experiment, therefore the conclusion doesn’t apply to me”.)
A couple of relevant posts/threads that come to mind:

- Individual vs. Group Epistemic Rationality
- Raemon’s recent shortform on adversarial debates producing positive externalities