This argument has already been conclusively refuted by the Sequence entry Doublethink (Choosing to be Biased).

Eliezer points out that it is impossible to have false beliefs only when they’re beneficial to you.
You can’t know the consequences of being biased, until you have already debiased yourself. And then it is too late for self-deception.
The other alternative is to choose blindly to remain biased, without any clear idea of the consequences. This is not second-order rationality. It is willful stupidity.
Be irrationally optimistic about your driving skills, and you will be happily unconcerned where others sweat and fear. You won’t have to put up with the inconvenience of a seat belt. You will be happily unconcerned for a day, a week, a year. Then crash, and spend the rest of your life wishing you could scratch the itch in your phantom limb. Or paralyzed from the neck down. Or dead. It’s not inevitable, but it’s possible; how probable is it? You can’t make that tradeoff rationally unless you know your real driving skills, so you can figure out how much danger you’re placing yourself in. You can’t make that tradeoff rationally unless you know about biases like neglect of probability.
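To see the shape of that tradeoff, here’s a minimal expected-cost sketch; every probability and cost in it is invented for illustration, not taken from the post or from any crash data.

```python
# Minimal expected-cost sketch of the seatbelt tradeoff.
# All probabilities and costs below are invented for illustration.
P_CRASH_TRUE = 0.03        # actual chance of a serious crash (assumed)
P_CRASH_OPTIMIST = 0.001   # what the irrationally optimistic driver believes
HARM_UNBELTED = 1_000_000  # cost of a crash without a seatbelt (arbitrary units)
HARM_BELTED = 200_000      # cost of a crash with a seatbelt
INCONVENIENCE = 1_000      # lifetime nuisance of buckling up

def expected_cost(p_crash: float, wear_belt: bool) -> float:
    harm = HARM_BELTED if wear_belt else HARM_UNBELTED
    return p_crash * harm + (INCONVENIENCE if wear_belt else 0)

# Each driver decides using the probability they *believe*,
# but reality charges them according to the *true* probability.
for label, believed_p in [("calibrated", P_CRASH_TRUE), ("optimist", P_CRASH_OPTIMIST)]:
    wears = expected_cost(believed_p, True) < expected_cost(believed_p, False)
    print(f"{label}: wears belt = {wears}, "
          f"true expected cost = {expected_cost(P_CRASH_TRUE, wears):,.0f}")
```

With these made-up numbers the optimist skips the belt and carries a true expected cost of 30,000 against the calibrated driver’s 7,000; the decision flips entirely on knowing your real crash probability.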
It is likely, in your scenario, that Gina has other false beliefs that harm her just as much as her optimism helps her.
In theory it’s possible: look at what beliefs cause success, and adopt them. (Assuming you’re really good at determining causality between beliefs and success, really good at avoiding checking whether these beliefs are true, and really good at installing beliefs in your mind—but let’s assume the least convenient possible world for the sake of argument, where you are indeed really good at those things.)
Now whether our world looks like that or not is another question...
That’s just what Eliezer is pointing at with “Choosing to be Biased”. In practice, it’s impossible to determine causality between beliefs and “success”. There are lots of successful people with all kinds of utterly insane beliefs. Were their insane beliefs the cause of their success? Or were these people so talented that they were able to achieve success despite those beliefs?
In contrast, determining whether a belief results in good predictions about the world is quite easy, and gets you many of the benefits without any of the potentially catastrophic costs.
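To illustrate the contrast, here’s a minimal simulation; the sigmoid adoption rule, the effect sizes, and the success threshold are all invented assumptions, not data about real successful people.

```python
# Minimal simulation, all assumptions invented for illustration:
# "talent" drives both the odd belief and the success, so the belief
# correlates with success while causing none of it.
import math
import random

random.seed(0)

def simulate(n=100_000):
    data = []
    for _ in range(n):
        talent = random.gauss(0.0, 1.0)
        # Assumed: more talented people happen to adopt the belief more often.
        has_belief = random.random() < 1.0 / (1.0 + math.exp(-talent))
        # Success depends on talent plus noise; the belief plays no causal role.
        success = (talent + random.gauss(0.0, 1.0)) > 1.0
        data.append((has_belief, success))
    return data

data = simulate()
for flag, label in [(True, "believers"), (False, "non-believers")]:
    group = [success for has_belief, success in data if has_belief == flag]
    print(f"{label}: success rate = {sum(group) / len(group):.3f}")
# Believers come out ahead, yet by construction the belief is causally inert;
# the observational data alone can't tell you that.

# By contrast, scoring whether a belief makes good predictions needs no
# causal story at all (Brier score: mean squared error, lower is better).
def brier_score(forecasts, outcomes):
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

print(brier_score([0.9, 0.8, 0.1], [1, 1, 0]))  # 0.02
```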
This makes me wonder. Does it mean we can’t determine if true beliefs cause success either? Or is there some important asymmetry?

(I like the post about Doublethink (Choosing to be Biased), but I wouldn’t say it “conclusively refuted” anything.)

I like that post too, but it occurs to me that it hinges on the choice of how most beneficially to be biased being made by the person themselves, unbiasedly (or less-biasedly) evaluating the alternatives. In which case you then get the tension where that part of the person does actually know the truth, and so on.

In particular, it seems to me that it’s still possible to have the choice made for you, so that you tend to have some kinds of false beliefs in a way that predictably correlates with benefiting you (although not necessarily in the form of making you happy). I’m speaking, of course, of natural evolution equipping people with a variety of social and cognitive biases, as well as predispositions to beliefs around religion, tribal matters, etc. I’m far from expert in the various biases and the research on them, but I would expect at least some of them aren’t just failures of bounded rationality or unwanted misgeneralization; they are, or were, often socially adaptive (at least locally, given bounded rationality, for the average person, and perhaps more often in much more ancient environments that aren’t as evolutionarily out of distribution as the modern world).