My take is that this does need to be addressed, but it should be done very carefully so as not to make the dynamic worse.
I have many post drafts on this topic. I haven’t published any because I’m very much afraid of making the tribal conflict worse, or of being ostracized from one or both tribes.
Here’s an off-the-cuff attempt to address the dynamics without pointing any fingers or even naming names. It might be too abstract to serve the purposes you have in mind, but hopefully it’s at least relevant to the issue.
I think it’s wise (or even crucial) to be quite careful, polite, and generous when addressing views you disagree with on alignment. Failing to do so runs a large risk that your arguments will backfire and delay converging on the truth of crucial matters. Strongly worded arguments can engage emotions and ideological affiliations. The field of alignment may not have the leeway for internal conflict distorting our beliefs and distracting us from making rapid progress.
I do think it would be useful to address those tribal-ish dynamics, because I think they’re not just distorting the discussions, they’re distorting our individual epistemics. I think motivated reasoning is a powerful force, in conjunction with cognitive limitations that prevent us from weighing all of the evidence and arguments in complex domains.
I’m less worried about naming the groups than I am about causing more logic-distorting, emotional reactions by speaking ill of dearly-held beliefs, arguments, and hopes. When naming the group dynamics, it might be helpful to stress individual variation, e.g. “individuals with more of the empiricist (or theorist) outlook.”
In most of society, arguments don’t do much to change beliefs. It’s better in more logical/rational/empirically leaning subcultures like LessWrong, but we shouldn’t assume we’re immune to emotions distorting our reasoning. Forceful arguments are often implicitly oppositional, confrontational, and insulting, and so have blowback effects that can entrench existing views and ignite tribal conflicts.
Science gets past this on average, given enough time. But the aphorism “science progresses one funeral at a time” should be chilling in this field.
We probably don’t have that long to solve alignment, so we’ve got to do better than traditional science. The alignment community is much more aware of and concerned with communication and emotional dynamics than the field I emigrated from, and probably than most other sciences. So I think we can do much better if we try.
Steve Byrnes’ Valence sequence is not directly about tribal dynamics, but it is indirectly quite relevant. It’s about the psychological mechanisms that tie ideas, arguments, and group identities to emotional responses (it focuses on valence, but the same steering-system mechanisms apply to other specific emotional responses as well). It’s not a quick read, but it’s a fascinating lens for analyzing why we believe what we do.