Orthogonally, cultural standards of emotional tone during debates also shape how much emotional struggle is involved in changing one’s ideas.
If the tone implies that you were foolish for holding your idea, it’s going to be a lot more painful to let it go.
LessWrong has a pretty good standard of not just civil but actively polite and supportive discourse. This actually seems pretty crucial to its being an environment in which people regularly change their minds.
I don’t like the term “arena” in your suggested division, because it implies combat. Combat is emotionally intense; I’d rather have a more collaborative metaphor.
This doesn’t eliminate the worth of having separate spaces for support and rigorous testing of ideas, but I think it’s important to keep in mind whenever we’re discussing group epistemics.
I claim there’s a Pareto frontier of epistemic correctness vs. emotional kindness. Some things, like sneering at people and implying that they are foolish, are Pareto-suboptimal. But once you achieve Pareto optimality, there is a tradeoff between kindness and correctness; and what I think should exist is two distinct spaces at different points on this tradeoff curve (and of course nobody should do Pareto-suboptimal things).
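The Pareto-dominance claim here can be made concrete with a toy sketch (all the discourse-style names and scores below are hypothetical illustrations, not measurements of anything): a style is Pareto-suboptimal exactly when some other style is at least as kind and at least as correctness-promoting, and strictly better on one of those axes.

```python
def pareto_frontier(moves):
    """Return the moves not dominated by any other move.

    Each move is a (name, kindness, correctness) tuple; higher is
    better on both axes. A move is dominated if another move is at
    least as good on both axes and strictly better on one.
    """
    frontier = []
    for name, k, c in moves:
        dominated = any(
            (k2 >= k and c2 >= c) and (k2 > k or c2 > c)
            for n2, k2, c2 in moves
            if n2 != name
        )
        if not dominated:
            frontier.append(name)
    return frontier

# Hypothetical scores purely for illustration.
moves = [
    ("sneering", 0.1, 0.6),
    ("blunt critique", 0.4, 0.9),
    ("supportive critique", 0.7, 0.7),
    ("pure validation", 0.9, 0.2),
]
print(pareto_frontier(moves))
# -> ['blunt critique', 'supportive critique', 'pure validation']
```

In this sketch, “sneering” is dominated by “blunt critique” (which is both kinder and more correctness-promoting), so it drops out; the remaining three sit on the frontier, where improving on one axis costs something on the other.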
once you achieve Pareto optimality, there is a tradeoff between kindness and correctness
It’s hard to stay on a Pareto frontier: optimizing directly for more (or less) “kindness” is a Goodharting hazard. If you ask for something, you might just get poisoned with more of the fake version of it.
I’d prefer less of the sort of “kindness” that trades off with correctness, rather than more of it (even when getting less of it wouldn’t actually help with correctness; it just doesn’t seem like a good thing). But if I ask for that, I’ll end up getting some (subtle) sneering and trolling, or unproductive high-standards elitism that wants, on general principle, to destroy ideas that haven’t yet had a chance to grow up. Similarly, if you ask for the sort of “kindness” that does trade off with correctness, you’ll end up getting some sycophancy, essentially, that cultivates your errors, making them stronger and more entrenched in your identity, ever more painful and less feasible to eventually defeat (even if there are benign forms of this sort of “kindness” that merely fail to make the problem worse, in a comfortable way, as opposed to trying to intervene on it).
I don’t think unsycophantic kindness is quite that difficult to achieve; clearly some groups of people IRL achieve such kindness. Generally, people in such communities try to understand each other, and why they believe the things they do, without judgement in either direction, and affirm the emotional responses to beliefs rather than the beliefs themselves. You don’t have to agree with someone to agree that you’d feel the same in their shoes. Somehow, these groups of people don’t inevitably slide into subtle sneering, trolling, and sycophancy.
Plus, the point of explicitly separating the arena and the antechamber is to make it clear that when you are receiving kindness, you are not receiving updates towards truth. So it is clear, to you and to the people around you, that receiving emotional validation in the antechamber is not evidence that your beliefs are correct. It’s valid for people to spend all their time in the antechamber, but everyone will see this and assign less weight to the truthfulness of their beliefs.
I also don’t think non-sycophantic kindness causes people to dig into their incorrect beliefs. If anything, it seems more common that people dig into incorrect beliefs because of a sense of adversity against others. Think about how much more painful it is to concede a point when your interlocutor is being really mean about it, versus when they are thoughtful and hear you out.
if anything, it seems more common that people dig into incorrect beliefs because of a sense of adversity against others
Consider cults (including milder things like weird “alternative” health-advice groups, etc.). Positivity and mutual support seem like key elements of their architecture, and adversity often comes primarily from peers rather than from an outgroup. I’m not talking about isolated beliefs; the content of and motivations for those tend to be far more legible. A lot of belief memeplexes either have too few followers, or aren’t distinct enough from all the other nonsense, to be explicitly labeled as cults or ideologies, or to be organized, but you generally can’t argue their members out of alignment with the group (on the relevant beliefs, considered altogether).
the point … is to make it clear that when you are receiving kindness, you are not receiving updates towards truth
This is also a standard piece of anti-epistemic machinery in groups that reinforce some nonsense memeplex among themselves with support and positivity. Support and positivity are great, but directing them to systematically taboo correctness-fixing activity is what I’m gesturing at: the sort of “kindness” that by its intent and nature tends to trade off against correctness.
nobody should do Pareto-suboptimal things

I’m not sure it’s that simple. Even if it is, people do suboptimal things all the time. It seems worth watching.