When I say ‘our culture’, I mean modern WEIRD (Western, Educated, Industrialized, Rich, Democratic) culture, especially the English-speaking world.
I’ve noticed this effect too and have been surprised at how viciously some people criticize altruistic endeavors. I often see this directed against the EA movement, for instance.
Since your post is oriented toward a narrower community than WEIRD culture (LessWrong, which has overlap with EA), I want to comment that I feel like there are additional factors at play within EA. Sometimes the goal of “trying to do the most good” implies that there’s more to gain from correcting bad beliefs about (e.g.) AI alignment risks than from correcting bad beliefs about how to pursue (self-proclaimed) selfish aims. Belief-forming processes are influenced by underlying motivations and by the self-image (or public image) we want to convey. Someone who’s conveying an altruistic image is at greater risk of biasing themselves and/or others and causing “view distortions.” (I say “and/or” to highlight that there’s not always a clear-cut distinction between self-deception and lying – sometimes people are semi-aware that they probably don’t believe something as confidently as they make it sound, but they go on talking despite that, others nod along, and they start to believe it more. By “view distortions,” I mean confidently voiced opinions that are off in a predictable direction but also – because of the status-seeking – optimized to be persuasive to influenceable members of the audience.)
So, in some contexts at least, it would be misleading to describe caring about others’ underlying motivations as “people wanting to police others’ motivations.” Instead, the real driver is concern for group epistemics – the motivations are only instrumentally relevant because of their link to view distortion.
That said, it’s easy/cheap to question others’ motives or wonder about various biases, and discussions often (if not always) tend to get worse when they shift away from the object level. Also, as your example indicates, a culture where people frequently question others’ motivations can quickly become toxic and hostile. My take is that discussions like that are sometimes a necessary evil, but it’s very hard to get the balance right.