Hi,
I’ve been lurking on LW/OB for a while but thought I’d sign up. I’m currently doing a philosophy degree, which you might expect would make me feel unwelcome on LW (which is often fairly anti-philosophy), but it’s actually really great to come across a group with a view similar to mine about how to do philosophy—I tend to come across more interesting philosophical ideas here than I do in classes.
Anyway, just thought I’d say hello.
With the gamblers, there could be two factors fighting it out:
1.) Maybe you’re more likely to gamble a lot in the first place if you’re badly calibrated, because you think you’re more likely to win than you actually are.
2.) But once you gamble regularly (and keep doing so), you might need to develop the ability to predict the future states of games.
No. 2 seems to explain the calibration with regard to future events. Maybe gamblers are more likely to display overconfidence on other calibration tasks, though, as that overconfidence would help explain why they choose to gamble in the first place.
Just a guess, though—no solid reason to believe it.