The point is that when someone is exploring or testing an idea, it might be better for them to explore the region of small updates around the original proposal, instead of giving up easily and trying something completely different. Many ideas fail because some small detail was wrong. When criticism is too harsh, it prevents people from doing even this; they might instead just keep proposing things close to what is already being tried. This is how you actually end up in a local minimum.
Not sure if this is an example of what you mean, but similar discussions remind me of Ignaz Semmelweis. He was a doctor who noticed that some patients were dying at a disproportionate rate, made a hypothesis about the possible cause, changed his behavior based on the hypothesis, and experimentally observed that fewer patients were now dying.
What happened next? The medical community found a technical mistake in his hypothesis, and instead of saying “well, this is technically wrong, but it also seems approximately right; we should explore the neighborhood of this hypothesis”, they focused on getting the man fired. Later, the correct hypothesis (germ theory) was indeed found in the neighborhood of his hypothesis.
I understood your article to mean that this is the type of mistake we should avoid: we should be able to explore the neighborhood of “technically wrong” ideas (without going to the opposite extreme and accepting all ideas indiscriminately, of course).
This also seems related to the idea of “steelmanning”: instead of finding a small mistake and dismissing the whole thing, try to fix the mistake and think again about the stronger version of the original hypothesis.
I’ll throw in a heuristic that I’ve seen Tyler Cowen use:
Instead of pointing out that X is wrong, ask under which set of circumstances X might be right.
My heuristic is: “assume that people’s reported experiences are usually true, but their interpretations and conclusions are usually wrong”. I use it not for science but for everyday life, and it assumes basic trustworthiness of the person, i.e. no previous experience of the person misreporting facts.