From my perspective, the simplest and best explanation by far is the Hansonian one: it’s just bad signaling to talk about caring less about things as opposed to more. You can’t form a social group / movement / alliance around it, it makes you look heartless, etc.
(It’s perfectly fine to pair caring less about a thing with caring more about another thing, e.g. the parts of the “manosphere” that combine caring more about having sex with women and caring less about feminism.)
The good news for rationalists is that, as a corollary, there’s money on the ground to be picked up deliberately caring less about things, especially things it would be bad signaling to publicly advocate for caring less about. Some plausible candidates off the top of my head:
Recycling
The news (except these days there’s mission-critical stuff like possible increased threat of nuclear war)
Formal education, maybe
Offending people
Infrequent tragic events that kill small numbers of people
Answering emails
I think you’re right, but I’ll start by assuming that you’re wrong, because I have an alternative explanation for those who disagree with you (and which I think is the most convincing one if we assume that the signalling explanation isn’t correct). I think Eukaryote is missing one important cause. Assume that most writers arguing for caring more or less about a cause are doing so because they believe this is an important way to serve that cause; particularly outside our community, people rarely write about causes just for intellectual entertainment. “Everything is signalling” is a valid response, but I’ll first reply to the rational-actor case, since that informs the limits and types of signalling. If I am writing to people who are “broadly value aligned” (an admittedly imprecise term), I tend to expect that they are not opposed to me on the topics I think are most important. And I expect most instances of a piece being read (85% if I’m optimistic) to involve a reader who is broadly value aligned with the author, at least with respect to the topic of the piece.
If someone cares less about something, I might value that directly (because I dislike the results of their caring), and I might value it indirectly (because I expect the effort they take away from the target of my writing to go towards other causes that I value). However, conditional on broad value alignment, the causes my readers care passionately about are not causes I’m opposed to, and the causes I care passionately about are not ones they’re opposed to. So, except in writing that is explicitly trying to convince people “from the other side”, direct benefit will rarely motivate me to try to make people care less.
Most communities have more than 3-4 potential cause areas. One specific friend of mine will physically go to events to support gun control, gender equality, fighting racism, homelessness prevention, Palestine, Pride, her church, abortion rights, and other topics. If I make her less confident that gun control is an effective way of reducing violence, the effort she frees up will be split fairly broadly across the rest. It is unlikely that whatever topic I find most important, or even whatever bundle of topics I find most important, is going to receive much marginal support. EAs are relatively unusual in that deactivating someone along one cause area has a high expected effect on specific other cause areas.
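The marginal-support argument above can be made concrete with a toy calculation. This is only an illustrative sketch, assuming freed-up effort splits evenly among a person's remaining causes; the function name and all the numbers are made up for the example:

```python
def marginal_support(freed_effort, remaining_causes, my_causes):
    """Expected effort that reaches causes I care about, assuming the
    effort freed by deactivating one cause splits evenly among the
    person's remaining causes (a deliberately crude model)."""
    return freed_effort * my_causes / remaining_causes

# A friend with 8 remaining causes, of which I share only 1: little of
# the freed effort reaches anything I care about.
diffuse = marginal_support(1.0, 8, 1)        # 0.125 units

# An EA-style community where freed effort concentrates on 2 causes,
# both of which I share: almost all of it reaches causes I value.
concentrated = marginal_support(1.0, 2, 2)   # 1.0 units

print(diffuse, concentrated)
```

Under this model, persuading someone to care less is worth far more inside a community with concentrated, shared cause areas than in the general population, which matches the claim that EAs are unusual in this respect.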
A couple more candidates:
How much you invest in your kids
Education