For some reason “correcting” people’s reasoning was important enough in the ancestral environment to be special-cased in motivation hardware.
I agree that this is a “natural” urge, but only in-tribe. It can be conditioned to be arbitrarily weak, for example by changing how much we care about these people. In other words, if you can imagine that their being wrong does not affect your well-being or the well-being of your group, you usually don’t have nearly as much urge to correct them. If you imagine that their errors are to your benefit, you will want them to stay wrong.
Oh yeah, it can definitely be turned off, which is the intended effect of this post.