This kind of thinking actively drives me and many others I know away from LW/EA/Rationality
And that kind of thinking (appeal to the consequence of repelling this-and-such kind of person from some alleged “community”) has been actively driving me away. I wonder if there’s some way to get people to stop ontologizing “the community” and thereby reduce the perceived need to fight for control of the “LW”/“EA”/“rationalist” brand names? (I need to figure out how to stop ontologizing, because I’m exhausted from fighting.) Insofar as “rationality” is a thing, it’s something that Luke-like optimization processes and Zvi-like optimization processes are trying to approximate, not something they’re trying to fight over.
As usual, this makes me wish for UberFact or some other way of tracking opinion clusters.