You bring up a good point that profitable trading doesn’t imply a globally better calibrated worldview, just a locally better calibrated one.
In your example:
If someone thinks the home team wins 100% of the time when it's raining, they will bet too much size.
Betting "sensibly relative to your bankroll" means betting 100% of your bankroll if you truly think the event is certain.
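A minimal sketch of the standard Kelly-criterion sizing rule makes this concrete (the function name and even-odds default are my assumptions, not anything from the thread): at a believed probability of 1, the optimal fraction is the entire bankroll.

```python
# Kelly fraction for a binary bet paying b-to-1 with believed win probability p.
# (Illustrative sketch; kelly_fraction and the even-odds default b=1 are assumptions.)
def kelly_fraction(p: float, b: float = 1.0) -> float:
    """f* = (b*p - (1 - p)) / b, clipped at 0 (no edge -> no bet)."""
    return max(0.0, (b * p - (1.0 - p)) / b)

print(kelly_fraction(0.55))  # modest edge -> bet 0.10 of bankroll
print(kelly_fraction(1.00))  # believed certainty -> 1.0, i.e. the whole bankroll
```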
Revealed belief vs. Preferred belief.
100% + size accordingly ⇒ eventual ruin: under full sizing a single loss is unrecoverable (the ergodicity problem)
100% + size small ⇒ they don't actually believe 100%; their sizing reveals meta-uncertainty, and if they stay profitable in the long run, the size they settle on converges to the true conditional edge after costs (see the toy simulation below).
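A toy simulation of the two cases, under assumed numbers that are mine, not the thread's: the bettor believes the rain signal is a 100% winner, but the true conditional win rate is only 0.9 on even-odds bets. Full sizing is ruined by the first loss; sizing at the Kelly fraction for the true 0.9 edge (0.8 of bankroll) compounds.

```python
import random

# Illustrative assumptions: true conditional win rate 0.9, even-odds payout,
# 200 bets, fixed seed for repeatability.
def simulate(fraction: float, true_p: float = 0.9, n_bets: int = 200,
             bankroll: float = 1.0, seed: int = 0) -> float:
    rng = random.Random(seed)
    for _ in range(n_bets):
        stake = bankroll * fraction
        bankroll += stake if rng.random() < true_p else -stake
        if bankroll <= 0:
            return 0.0  # ruined; a full-bankroll loss has no way back
    return bankroll

print(simulate(fraction=1.0))  # "believe 100%, size accordingly": ruin is near-certain
print(simulate(fraction=0.8))  # Kelly-sized for the true 0.9 edge: bankroll compounds
```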