...beliefs are like clothes. In a harsh environment, we choose our clothes mainly to be functional, i.e., to keep us safe and comfortable. But when the weather is mild, we choose our clothes mainly for their appearance, i.e., to show our figure, our creativity, and our allegiances. Similarly, when the stakes are high we may mainly want accurate beliefs to help us make good decisions. But when a belief has few direct personal consequences, we in effect mainly care about the image it helps to project.
-Robin Hanson, Human Enhancement
I feel like Hanson’s admittedly insightful “signaling” hammer has him treating everything as a nail.
Your contrarian stance against a high-status member of this community makes you seem formidable and savvy. Would you like to be allies with me? If yes, then the next time I go foraging I will bring you back extra fruit.
I agree in principle but I think this particular topic is fairly nailoid in nature.
I’d say it’s such a broad subject that there have to be some screws in there as well. I think Hanson has too much faith in the ability of evolved systems to function in a radically changed environment. Even if signaling dominates the evolutionary origins of our brain, it’s not advisable to label everything we do now as directed towards signaling, any more than it’s advisable to treat sex as always directed towards reproduction. You have to get into the nitty-gritty of how our minds actually carry out the signaling. Conspiracy theorists don’t signal effectively, though you can probably trace their behavior back to mechanisms originally directed towards, or at least compatible with, signaling.
Also, an ability to switch between clear “near” thinking and fluffy “far” thinking presupposes a rational decision maker to implement the switch. I’m not sure Hanson pays enough attention to how, when, and for what reasons we do this.
Same here.
I think he’s mischaracterizing the issue.
Beliefs serve multiple functions. One is modeling accuracy; another is signaling. It’s not whether the environment is harsh or mild, it’s which function you need. There are many harsh environments where what you need is the signaling function, not the modeling function.
I think the quote reflects reality (humans aren’t naturally rational, so their beliefs are conditioned by circumstance), but it is better read as an observation than as a recommendation. The best approach should always be to hold maximally accurate beliefs yourself, even if you choose to signal different ones as the situation demands. That way you can gain the social benefits of professing a false belief without letting it distort your predictions.
The best approach should always be to hold maximally accurate beliefs yourself, even if you choose to signal different ones as the situation demands.

No, that wouldn’t necessarily be the case. We should expect a cost in effort and effectiveness from trying to switch on the fly between the two types of truths. Lots of far truths have little direct predictive value but lots of signaling value. Why bear the cost for a useless bit of predictive truth, particularly if it is worse than useless and hampers signaling?
That’s part of the magic of magisteria—segregation of modes of truth by topic reduces that cost.
Hmm, maybe I shouldn’t have said “always”, given that acting ability is required to signal a belief you don’t hold, but I do think what I suggest is the ideal. I think someone who trained themselves to do what I suggest, by studying people skills and so forth, would do better, as they’d get the social benefits of conformity without the disadvantages of false beliefs clouding their predictions (though admittedly the time investment of learning these skills would have to be considered).
Short version: I think this is possible with training and would make you “win” more often, and thus it’s what a rationalist would do (unless the cost of training proved prohibitive, which I doubt, since these skills are highly transferable).
I’m not sure what you meant by the magisteria remark, but I get the impression that holding spiritual/long-term beliefs to less stringent standards than short-term ones isn’t generally seen as a good thing (see Eliezer’s “Outside the Laboratory” post, among others).
Clothes serve multiple functions. One is keeping warm; another is signaling.