If 90% of the population believes in Religion X, and they all get their opinions on Topic Y from Guru Z, then a naive view of the unilateralist’s curse will say that since Guru Z’s opinion on Topic Y is shared by a majority of the population, action in line with the Guru’s opinion is “multilateral”. Any action Guru Z disagrees with is “unilateral”, even if the remaining 10% of the population worship a broad variety of gods, follow a broad variety of gurus, and are all in agreement that Guru Z happens to be wrong in this case. (BTW see my recent comment on ideological Matthew effects.)
One of Phil Tetlock’s forecasting secrets is something he calls “extremization”: find forecasters who don’t correlate well with each other in general, and if they end up agreeing on something anyway, become even more certain it’s the case. In Bayesian terms, observing conditionally independent lines of evidence counts for more than observing the same number of correlated ones.
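To make the Bayesian point concrete, here’s a toy sketch (not Tetlock’s actual extremization formula): if two forecasters each independently arrive at 70%, treating their judgments as conditionally independent evidence and multiplying likelihood ratios against a shared 50% prior yields a pooled estimate well above 70%, whereas simple averaging stays at 70%.

```python
def odds(p):
    """Convert a probability to odds."""
    return p / (1 - p)

def prob(o):
    """Convert odds back to a probability."""
    return o / (1 + o)

def pool_independent(forecasts, prior=0.5):
    """Pool forecasts as if each were conditionally independent
    evidence: multiply each forecaster's likelihood ratio
    (their odds relative to the shared prior) into the prior odds."""
    o = odds(prior)
    for p in forecasts:
        o *= odds(p) / odds(prior)
    return prob(o)

forecasts = [0.7, 0.7]
average = sum(forecasts) / len(forecasts)   # 0.70 -- no extremization
pooled = pool_independent(forecasts)        # ~0.845 -- more extreme than either input
```

If the forecasters are actually correlated (e.g., both follow Guru Z), the independence assumption is wrong and this pooling overstates the evidence, which is exactly why diversity of perspectives matters.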
The frustrating thing about discussions of the unilateralist’s curse is that in order to develop the diversity of perspectives that extremization requires, you need people seriously considering the possibility that the group is wrong, perhaps playing devil’s advocate, and at the very least not feeling pressure to conform intellectually. But that kind of intellectual noncompliance is the very thing a person concerned with the unilateralist’s curse will want to clamp down on if they want to prevent unilateral action.
I think the thing to do is separate out talking and acting: People should be developing their own handcrafted models of the world to make extremization possible, but they should be using ensembles of world models to choose important actions.
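A minimal sketch of the talk/act separation, with made-up numbers: each person maintains their own handcrafted estimate of an action’s value (and argues for it freely), but the decision to act is gated on the ensemble, not on any individual model.

```python
# Hypothetical expected-value estimates for some risky unilateral action,
# one per independently developed world model. The numbers are invented
# purely for illustration.
model_estimates = {
    "my_model": +5.0,   # my handcrafted model says: do it
    "peer_a":   -3.0,
    "peer_b":   -2.0,
    "peer_c":   -1.0,
}

# Think unilaterally: keep and argue for your own estimate.
my_view = model_estimates["my_model"]

# Act multilaterally: gate the action on the ensemble average.
ensemble_value = sum(model_estimates.values()) / len(model_estimates)
should_act = ensemble_value > 0   # here: False, despite my_view being positive
```

The point is that a strongly positive individual estimate is still useful as evidence for the group, but it doesn’t by itself license action.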
Think unilaterally, act multilaterally.