Human inability to assign numerical probabilities

Whenever we talk about the probability of an event that we do not have perfect information about, we generally use qualitative descriptions (e.g. possible but improbable). When we do use numbers, we usually just stick to a probability range (e.g. 1/4 to 1/3). A Bayesian should be able to assign a probability estimate to any well-defined hypothesis, but for a human, trying to assign a numerical probability estimate feels uncomfortable and arbitrary. Even when we can give a probability range, we resist averaging it into a single point estimate. For instance, I’d say that the Republicans are more likely than not to take over the House, but the Democrats still have a chance. After pressing myself, I managed to say that the probability of the Democratic party keeping control of the House in the next election is somewhere between 25% and 40%. Condensing this to 32.5% just feels wrong and arbitrary. Why is this? I have thought of three possible reasons, which I list below in order of likelihood:
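
(A quick note on where the 32.5% comes from: this is my reading of the condensing step, assuming the range is treated as a symmetric interval and collapsed to its midpoint, which is not the only way a Bayesian could turn an interval into a point estimate: (25% + 40%) / 2 = 32.5%.)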

Maybe our brains are just built like frequentists. If we innately think of probabilities as properties of repeatable events rather than of individual hypotheses, it makes sense that we would not assign an exact probability to a one-off hypothesis. If this is correct, it would mean that the tendency to think in frequentist terms is too entrenched to be easily untrained: I try to think like a Bayesian and yet I still suffer from this effect, and I suspect the same is true of most of you.

Maybe, since our ancestors never needed to express numerical probabilities, our brains never developed the ability to do so. Even if we have representations somewhere in our brains for the probabilities of hypotheses, they could be buried in the decision-making machinery, and the signal could get garbled when we try to pull it out in verbal form. However, we also get uncomfortable when forced to make important decisions on limited information, which would be evidence against this: if clean probability estimates were already sitting in the decision-making machinery, acting on them should not feel so hard.

Maybe there is selection pressure against giving specific answers, because a precise prediction makes it harder to inflate your apparent accuracy after the fact, resulting in lower status. This seems highly unlikely, but since I thought of it, I felt compelled to include it anyway.

As there are people on this forum who actually know a thing or two about cognitive science, I expect I’ll get some useful responses. Discuss.

Edit: I did not mean to imply that it is wise for humans to give a precise probability estimate, only that an ideal Bayesian would, while we do not.