Do people think in a Bayesian or Popperian way?

People think A&B is more likely than A alone, if you ask the right question. That’s not very Bayesian; as far as you Bayesians can tell it’s really quite stupid.
Is that maybe evidence that Bayesianism is failing to model how people actually think?
Popperian philosophy can make sense of this (without hating on everyone! it’s not good to hate on people when there are better options available). It explains it like this: people like explanations. When you say “A happened because B happened” it sounds to them like a pretty good explanatory theory which makes sense. When you say “A alone” they don’t see any explanation, and they read it as “A happened for no apparent reason”, which is a bad explanation, so they score it worse.
To make this concrete, you could take A = economic collapse and B = nuclear war.
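For reference, the probability rule this pattern runs afoul of fits in one line: the event A∧B is contained in the event A, so whatever A and B are, “economic collapse caused by nuclear war” can never be more probable than “economic collapse” by itself.

```latex
% Conjunction rule: the event A \land B entails the event A, so
P(A \land B) = P(A)\,P(B \mid A) \le P(A)
```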
People are looking for good explanations. They are thinking in a Popperian fashion.
Isn’t it weird how you guys talk about all these biases which basically consist of people not thinking in the way you think they should, but when someone says “hey, actually they think in this way Popper worked out” you think that’s crazy because the Bayesian model must be correct? Why did you find all these counterexamples to your own theory and then never notice they mean your theory is wrong? In the cases where people don’t think in a Popperian way, Popper explains why (mostly because of the justificationist tradition informing many mistakes since Aristotle).
More examples, from http://wiki.lesswrong.com/wiki/Bias:
Scope Insensitivity—The human brain can’t represent large quantities: an environmental measure that will save 200,000 birds doesn’t conjure anywhere near a hundred times the emotional impact and willingness-to-pay of a measure that would save 2,000 birds.
Changing the number does not change most of the explanations involved, such as why helping birds is good, what the person can afford to spare, how much charity it takes the person to feel altruistic enough (or moral enough, involved enough, helpful enough, whatever), etc… Since the major explanatory factors they were considering don’t change in proportion to the number of birds, their answer doesn’t change proportionally either.
Correspondence Bias, also known as the fundamental attribution error, refers to the tendency to attribute the behavior of others to intrinsic dispositions, while excusing one’s own behavior as the result of circumstance.
This happens because people usually know the explanations/excuses for why they did stuff, but they don’t know them for others. And they have more reason to think of them for themselves.
Confirmation bias, or Positive Bias, is the tendency to look for evidence that confirms a hypothesis, rather than disconfirming evidence.
People do this because of the justificationist tradition, dating back to Aristotle, which Bayesian epistemology is part of, and which Popper rejected. This is a way people really don’t think in the Popperian way—but they could and should.
Planning Fallacy—We tend to plan envisioning that everything will go as expected. Even assuming that such an estimate is accurate conditional on everything going as expected, things will not go as expected. As a result, we routinely see outcomes worse than the ex ante worst case scenario.
This is also caused by the justificationist tradition, which Bayesian epistemology is part of. It’s not fallibilist enough. This is a way people really don’t think in the Popperian way—but they could and should.
Well, that’s part of the issue. The other part is that they come up with a good explanation of what will happen, and they go with that. That part of their thinking fits what Popper said people do. The problem is not enough criticism, which stems from the popularity of justificationism.
Do We Believe Everything We’re Told? - Some experiments on priming suggest that mere exposure to a view is enough to get one to passively accept it, at least until it is specifically rejected.
That’s very Popperian. The Popperian way is that you can make conjectures however you want, and you only reject them if there’s a criticism. No criticism, no rejection. This contrasts with the justificationist approach, in which ideas are required to (impossibly) have positive support, and the focus is on positive support, not criticism (thus causing, e.g., Confirmation Bias).
Illusion of Transparency—Everyone knows what their own words mean, but experiments have confirmed that we systematically overestimate how much sense we are making to others.
This one is off topic but there are several things I wanted to say. First, people don’t always know what their own words mean. People talking about tricky concepts like God, qualia, or consciousness often can’t explain what they mean by the words if asked. Sometimes people even use words without knowing the definition; they just heard the word in a similar circumstance another time or something.
The reason others don’t understand us, often, is the nature of communication. For communication to happen, the other person has to create knowledge of the idea(s) you are trying to express to him. That means he has to make guesses about what you are saying and use criticism to improve those guesses (e.g. by ruling out stuff incompatible with the words he heard you use). In this way Popperian epistemology lets us understand communication, and why it’s so hard.
Evaluability—It’s difficult for humans to evaluate an option except in comparison to other options. Poor decisions result when a poor category for comparison is used. Includes an application for cheap gift-shopping.
It’s because they are trying to come up with a good explanation of what to buy. And “this one is better than this other one” is a pretty simple and easily available kind of explanation to create.
The Allais Paradox (and subsequent followups) - Offered choices between gambles, people make decision-theoretically inconsistent decisions.
How do you know that kind of thing and still think people reason in a Bayesian way? They don’t. They just guess at what to gamble, and the quality of the guesses is limited by what criticisms they use. If they don’t know much math then they don’t subject their guesses to much mathematical criticism. Hence this mistake.
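For anyone who hasn’t seen it, here is a minimal sketch of what “decision-theoretically inconsistent” means in the Allais case. The payoffs and probabilities below are the standard textbook version of the gambles, not anything specified by the quoted wiki page:

```python
# Standard Allais gambles; payoffs in millions of dollars.
# Each gamble is a list of (probability, payoff) pairs.
gamble_1a = [(1.00, 1)]                        # $1M for certain
gamble_1b = [(0.10, 5), (0.89, 1), (0.01, 0)]  # shot at $5M, tiny risk of $0
gamble_2a = [(0.11, 1), (0.89, 0)]
gamble_2b = [(0.10, 5), (0.90, 0)]

def expected_value(gamble):
    return sum(p * payoff for p, payoff in gamble)

for name, g in [("1A", gamble_1a), ("1B", gamble_1b),
                ("2A", gamble_2a), ("2B", gamble_2b)]:
    print(f"EV({name}) = ${expected_value(g):.2f}M")

# Most people choose 1A over 1B but 2B over 2A. That pattern is
# inconsistent for any expected-utility maximizer: 2A and 2B are just
# 1A and 1B with an identical 89% chance of $1M removed from both, so
# whoever prefers 1A should also prefer 2A.
```

Whatever utility you assign to money, preferring 1A commits you to preferring 2A; the common reversal is the inconsistency the wiki entry is pointing at, and people who haven’t subjected their guesses to this kind of mathematical criticism don’t notice it.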