My actual honest reaction to this sort of thing: Please, please stop. This kind of thinking actively drives me and many others I know away from LW/EA/Rationality. I see it strongly as asking the wrong questions with the wrong moral frameworks, and using it to justify abominable conclusions and priorities, and ultimately the betrayal of humanity itself—even if people who talk like this don’t write the last line of their arguments, it’s not like the rest of us don’t notice it. I don’t have any idea what to say to someone who writes ‘if I was told one pig was more important morally than one human I would not be surprised.’
That’s not me trying to convince anyone of anything, beyond that I have that reaction to this sort of thing, and that it seemed wrong for me not to say so given that I’m writing reviews. No demon threads, please; if I figure out how to say this in a way that would be convincing and actually explain, I’ll try to do that. This is not that attempt.
> This kind of thinking actively drives me and many others I know away from LW/EA/Rationality
And that kind of thinking (appeal to the consequence of repelling this-and-such kind of person away from some alleged “community”) has been actively driving me away. I wonder if there’s some way to get people to stop ontologizing “the community” and thereby reduce the perceived need to fight for control of the “LW”/”EA”/”rationalist” brand names? (I need to figure out how to stop ontologizing, because I’m exhausted from fighting.) Insofar as “rationality” is a thing, it’s something that Luke-like optimization processes and Zvi-like optimization processes are trying to approximate, not something they’re trying to fight over.
As usual, this makes me wish for UberFact or some other way of tracking opinion clusters.
Entirely seconded; this is my reaction also.