A contemporary example of inadequate ethical heuristics: Public discussion of group differences
I think the word “public” is doing a lot of heavy lifting here. I can think sanely about group differences on my own, and it would be easy to have that conversation with you. I’d also expect to be able to handle it with individuals who are strongly attached to one heuristic or the other, and perhaps even in small groups. I wouldn’t tweet about it. Nor would I strike up that conversation in a crowded room of strangers.
The problem isn’t that I can’t think sanely about such topics; it’s that they can’t, and in those cases, descriptively speaking, “they” and “I” is a better fit than “we”. At smaller scales, thinking as “we” is easier. If I say something that my close friend would have disagreed with, they know me well enough to appropriately weight the fact that I’m saying it, and that immediately updates our joint picture as “we” in ways that tweeting does not. And when the friend doesn’t buy it, the amount of “Hm, maybe I’m wrong” is very manageable, so engaging with the kind of humility that makes it “shared exploration toward truth” rather than “an attempt to manipulate” is easy. Even “bold” statements like “You’re doing it all wrong [and I know this because there’s nothing you’ve thought of that I haven’t considered which could justify a change of mind]” are fairly achievable in this context, because you do know your friend fairly well, and they know you know them fairly well, and so on.
Try to step up in scale, though, and it gets tougher. Either you have to back off a bit, because maybe the larger group knows things you don’t, or you have to know more, including how more distant strangers (incorrectly) model things. As the group you’re trying to move becomes larger and more powerful, the pushback you invite becomes louder and more meaningful. Have you found the courage to pick a fight that big, and still humble yourself appropriately, if necessary? Because if not, then you’re setting yourself up to fold prematurely, before standing strong enough to evoke the kind of evidence that would genuinely change your mind.
An analogy that comes to mind is “injection locking”, as demonstrated by groups of metronomes syncing up. Couple yourself too tightly to a large coordinated mass, and you’re likely to find your heart compelled to the same frequency, even as you recognize that it’s wrong (whoops, there goes your sanity). Decouple too much, and even if you’re not missing anything from the larger group, you’re not helping the group either. The trick is to regulate your coupling so that you can both influence and be influenced in ways that track truth, without losing track of genuinely valuable frequencies you’ve entrained to.
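For the curious, the coupling dynamic can be made concrete with the Kuramoto model of coupled oscillators, a standard toy model for metronome-style synchronization. This is my stand-in, not anything from the original metronome demo; the function name, parameters, and frequency distribution below are all illustrative choices. Each “metronome” keeps its own natural frequency but gets pulled toward the group’s mean phase, with the pull strength set by a single coupling knob:

```python
import math
import random

def simulate_kuramoto(n=20, coupling=0.5, steps=2000, dt=0.01, seed=0):
    """Simulate n phase oscillators ('metronomes') with all-to-all coupling.

    Each oscillator i has a natural frequency w[i] and a phase theta[i];
    it drifts at its own rate but is nudged toward the rest of the group,
    with strength set by `coupling`. Returns the order parameter r in
    [0, 1]: r near 0 means incoherent, r near 1 means fully synced.
    """
    rng = random.Random(seed)
    w = [rng.gauss(1.0, 0.1) for _ in range(n)]            # natural frequencies
    theta = [rng.uniform(0, 2 * math.pi) for _ in range(n)]  # random initial phases
    for _ in range(steps):
        new = []
        for i in range(n):
            # Kuramoto coupling term: average pull toward every other phase
            pull = sum(math.sin(theta[j] - theta[i]) for j in range(n)) / n
            new.append(theta[i] + dt * (w[i] + coupling * pull))
        theta = new
    # Order parameter: magnitude of the mean of the unit phasors e^{i*theta}
    cx = sum(math.cos(t) for t in theta) / n
    cy = sum(math.sin(t) for t in theta) / n
    return math.hypot(cx, cy)

# Weak coupling leaves the group incoherent; strong coupling locks it.
print(simulate_kuramoto(coupling=0.05))  # low r: everyone drifts independently
print(simulate_kuramoto(coupling=2.0))   # high r: the group has entrained
```

The knob worth playing with is `coupling`: below a critical value the oscillators never cohere, and above it they snap into lock, which is roughly the phase transition the analogy is gesturing at.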
And if you try to “cheat” and preserve your frequency by pushing on the group without opening yourself to pushback, that’s a good definition of manipulation, and when the group notices, it will backfire.
I think it’s important that “we” think carefully about what group we can really “we” with, without losing lock on reality, and without updating in ways that shut out information rather than incorporating it. And now that I think of it, the problem of how to scale up seems to be missing an ethical design pattern of its own. There’s not a lot of good guidance on how quickly to try to integrate with larger groups of metronomes.
Jordan Peterson took a crack at it with “Clean your room”/“Set your house in perfect order before you criticize the world”, but that’s more of a counter-heuristic than a bridge. It’s also overly moralistic and unachievable. In totally unrelated news, “Jordan Peterson” has become a scissor statement.
In short form, I’d probably phrase it as: “Be careful to match your ambition with humility and courage, and scale only as fast as you dare.”
Interesting. I like the “metronomes syncing” metaphor. It evokes the same feeling for me as a cloud of chaotically spinning dust collapsing into a solar system with roughly one axis of spin. It also reminds me of my “Map Articulating All Talking” (MAAT) concept. I’m planning to write up a post about it, but until then this comment thread is where I’ve written the most about it. The basic idea is that currently it is impossible to communicate with groups of humans sensibly and a new social media platform would solve this issue. (lol, ambitious I know.)
The size of the “we” is critically important. Communism can occasionally work in a small enough group where everyone knows everyone, but scaling it up to a country requires different group coordination methods to succeed.