Were I to discover that rationality does not lead to winning, or worse, that irrationality does, I would find it much more likely that incorrect beliefs enable people to take actions that lead them to win than that the world is governed by an explicitly irrational force.
The only way I could see this working is if there were some force that looks at my model of the world and makes it systematically wrong, no matter how much I update it. That is, only if there were some anti-Bayesian principle at work.
But I think we differ here in what we understand rationality to be; indeed, you write:
For example, if people believe they are average or below, they may be less aggressive and settle for less than if, by virtue of Dunning-Kruger, they believe they are exceptional and try for more
Let it be clear that I do not conflate being rational with being cautious, being reasonable or even having common sense: I intend it to have the pure meaning of “having the correct model of the world”. If in some endeavour those who try more (aggressively) achieve more, it means that the probability of success is low but not impossibly low, and it follows that the rational thing is to try more.
Those who try less may be being prudent, but in doing so they are underestimating their probability of success (or overestimating the probability of failure); that is, they do not have the correct model of the world, and this leads to irrational behaviour.
The second example (hitting the “big idea” or winning the lottery) is a case in which the winning strategy is uncomputable, but by sheer brute force someone will hit it. That’s admittedly a case in which winning was not due to rationality, but note that it wasn’t due to irrationality either: it was due to the pure luck of finding oneself at the only global optimum.
Let me state my position more precisely: let’s treat luck as a resource of some kind (a sort of better positioning in a potential space). There are domains in which having the correct model of the world leads to a better chance of winning, and there are other domains in which it makes no difference (an impartial beauty contest, a lottery). But there are no domains in which having the correct model of the world makes losing more probable.
So rationality always leads to an equal or better probability of winning.
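This claim can be illustrated with a toy decision model (all numbers here are made-up assumptions, not drawn from the discussion): an agent attempts an endeavour iff its believed expected value is positive, but its actual expected payoff depends on the true success probability. The calibrated agent never does worse than a miscalibrated one; the overconfident agent happens to act correctly here too, while the timid agent forgoes a positive-expected-value attempt.

```python
# Toy model (assumed numbers): one endeavour with a fixed true success
# probability, a payoff on success, and a cost to attempt it.
TRUE_P = 0.3   # true probability the attempt succeeds
PAYOFF = 10.0  # reward on success
COST = 2.0     # price of making the attempt

def actual_ev(believed_p):
    """The agent attempts iff its *believed* EV is positive, but its
    realized expected payoff is computed with the *true* probability."""
    attempts = believed_p * PAYOFF - COST > 0
    return TRUE_P * PAYOFF - COST if attempts else 0.0

calibrated = actual_ev(TRUE_P)      # correct model: attempts, EV = +1.0
timid = actual_ev(0.1)              # underestimates: stays out, EV = 0.0
overconfident = actual_ev(0.9)      # overestimates: attempts anyway, EV = +1.0

print(calibrated, timid, overconfident)
```

The calibrated agent’s expected payoff is at least as high as either miscalibrated agent’s, matching the claim: a wrong model can only tie or lose in expectation, never systematically beat the correct one.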
This, as I agreed, is an empirical question, but one which, if it were refuted, would imply the existence of the aforementioned irrational force.
ETA: thinking about it, entering a lottery is an irrational behaviour that leads to winning, but only for the person who will eventually win. So, in the domain of “bets that are possible to buy”, there is an irrational behaviour that leads to winning. But in this case there is an irrational force that promotes anti-Bayesian behaviour: the State (or the casino, etc.).
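A toy expected-value calculation (with made-up numbers) makes the ex-ante/ex-post split concrete: every ticket is a losing bet in expectation, yet one buyer still ends up far ahead.

```python
# Hypothetical lottery: 1,000,000 tickets sold at $2 each, one jackpot
# equal to $1,000,000 (so the house keeps half the pool).
TICKETS = 1_000_000
PRICE = 2.0
JACKPOT = 1_000_000.0

# Ex ante: each ticket's expected value is negative, so buying is irrational.
ev_per_ticket = JACKPOT / TICKETS - PRICE

# Ex post: the single winner nevertheless walks away with a large profit.
winner_profit = JACKPOT - PRICE

print(ev_per_ticket, winner_profit)
```

The negative per-ticket expectation is exactly the “anti-Bayesian force” in the comment: the house sets the odds so that a correct model says don’t play, even though someone must win.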