So if you have learned a thousand ways that humans fall into error and read a hundred experimental results in which anonymous subjects are humiliated for their overconfidence—heck, even if you’ve just read a couple of dozen—and you don’t know exactly how overconfident you are—then yes, you might genuinely be in danger of nudging yourself a step too far down.
I have also observed this phenomenon of debiasing being over-emphasized in discussions of rationality, while “heuristic” is treated as a bad word. I tried to get at the problem of passing up opportunities you mention when I said in my post on heuristics: “It’s a mistake in cartography to have areas of your map that are filled in wrong, but it’s also a mistake to have areas on your map blank that you could have filled in, at least with something approximate”.
I think we need more success stories of human heuristics. Currently, the glut of information on biases and faulty heuristics is making these more cognitively available, leading to underconfidence.
Of course, it’s easier to measure the gravity of mistakes of overconfidence, because we know the bad outcome, and we can speculate that it would have been avoided without the overconfidence. Yet in the case of mistakes of underconfidence, we don’t know what we are missing out on, what brilliant theories were prematurely discarded, and what groundbreaking inventions were never created, because the creators (or their colleagues, investors, advisors, professors, whoever) were underconfident.
Yet we can look at examples of great discoveries, ideas, solutions, and practices, and ask what our lives, or the world, would be like if they had been nipped in the bud. Furthermore, there may be cases where two people (say, scientists or entrepreneurs) were both acquainted with the same evidence or theory, yet only one was confident enough about it to capitalize on it.