This argument has already been conclusively refuted by the Sequences entry “Doublethink (Choosing to be Biased)”.
Eliezer points out that it is impossible to have false beliefs only when they’re beneficial to you.
You can’t know the consequences of being biased until you have already debiased yourself, and by then it is too late for self-deception.
The other alternative is to choose blindly to remain biased, without any clear idea of the consequences. This is not second-order rationality. It is willful stupidity.
Be irrationally optimistic about your driving skills, and you will be happily unconcerned where others sweat and fear. You won’t have to put up with the inconvenience of a seat belt. You will be happily unconcerned for a day, a week, a year. Then crash, and spend the rest of your life wishing you could scratch the itch in your phantom limb. Or paralyzed from the neck down. Or dead. It’s not inevitable, but it’s possible; how probable is it? You can’t make that tradeoff rationally unless you know your real driving skills, so you can figure out how much danger you’re placing yourself in. You can’t make that tradeoff rationally unless you know about biases like neglect of probability.
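To make that tradeoff concrete, here is a minimal expected-value sketch; every number in it (the crash probabilities, the harms, the “inconvenience cost” of buckling up) is an illustrative placeholder, not a real statistic:

```python
# A minimal expected-value sketch of the seatbelt tradeoff.
# All numbers are illustrative placeholders, not real accident statistics.

p_crash_true = 0.002        # your actual yearly chance of a serious crash
p_crash_believed = 0.00001  # what irrational optimism tells you it is

harm_without_belt = 1_000_000  # dollar-equivalent harm of crashing unbelted
harm_with_belt = 200_000       # harm of crashing belted
inconvenience = 30             # yearly "cost" of bothering with the belt

def expected_benefit_of_belt(p_crash):
    """Expected harm avoided per year by wearing the belt, given a crash probability."""
    return p_crash * (harm_without_belt - harm_with_belt)

# Deciding with the optimistic belief: the belt looks like a bad deal.
print(expected_benefit_of_belt(p_crash_believed) > inconvenience)  # False -> skip the belt

# Deciding with your real skill level: the belt is clearly worth it.
print(expected_benefit_of_belt(p_crash_true) > inconvenience)      # True -> buckle up
```

Neglect of probability is precisely the failure to run this comparison at all: the small probability gets rounded to zero, so the inconvenience is the only term that registers.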
It is likely, in your scenario, that Gina has other false beliefs that harm her just as much as her optimism helps her.
That’s just what Eliezer is pointing at with “Choosing to be Biased”. In practice, it’s impossible to establish causality between beliefs and “success”. There are plenty of successful people with all kinds of utterly insane beliefs. Were those insane beliefs the cause of their success? Or were these people so talented that they achieved success despite them?
In contrast, determining whether a belief results in good predictions about the world is quite easy, and gets you many of the benefits without any of the potentially catastrophic costs.
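As a sketch of what checking a belief’s predictions might look like in practice, here is a small, hypothetical scoring routine; the recorded forecasts and the choice of the Brier score are my own illustration, not something from the original post:

```python
# Hypothetical log of forecasts made under some belief, plus what actually happened.
# The numbers are invented purely for illustration.
predictions = [
    # (forecast probability that the event happens, whether it actually happened)
    (0.9, True),
    (0.7, False),
    (0.8, True),
    (0.6, True),
]

def brier_score(records):
    """Mean squared error between forecast probabilities and outcomes (lower is better)."""
    return sum((p - float(outcome)) ** 2 for p, outcome in records) / len(records)

print(brier_score(predictions))  # ~0.175; forecasting 0.5 for everything would score 0.25
```

Unlike asking whether a belief “caused” someone’s success, this check closes the loop on observable outcomes, which is why it sidesteps the causal-attribution problem from the previous paragraph.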