Very interesting post. This is similar to a topic I think about often. Several things:
(I only read a couple of the comments, so I may have missed something important.)
1. It’s possible to say “yeah, a lot of people are irrational and don’t care about truth, and that is bad and disappointing.” You make it part of the world model you use for predictions (in fact, you probably should). This change to the world model might predict that some plans/actions/rules you thought were good actually aren’t, because you can’t rely on people to act rationally. But that by itself doesn’t imply that people are bad, or that you “should” feel contempt toward them (I think no one should feel contempt at all, it’s not productive, but my opinion is not very relevant), or that you should care less about their well-being (I believe every creature capable of feeling happiness and suffering should experience happiness and not suffering, but again, my opinion is irrelevant).
2. A bunch of people are bad at stuff. I am also bad at stuff. I understand incredibly important things about the universe, but not all of them. There are plenty of important things I don’t know, and true things I don’t believe. Someone might look at me and think “look at this guy, he cares so much about having epistemological justifications for everything instead of giving money to charity, meditating, not eating meat, and learning physics,” or something like that.
And maybe they would be right. Maybe despite my commitment to truth, epistemology, and coherent models of the world, I am just “an NPC to manipulate for fun” to someone (because, really, what trait separates those hypothetical NPCs from “real people”? How common is it? Can it be acquired or lost? It probably doesn’t exist in the Territory).
I should not look down on them (even if they are irrational in many ways I consider important), and they shouldn’t look down on me; we should simply and objectively evaluate the traits we each have and lack.
We are all humans, humans run on bad software, and we are doing what we can despite that limitation. See point 1.
Maybe you don’t value truth as much as I do (a hypothetical claim, probably wrong, but how would I know, or you, without further information?). Should I treat you worse, or feel negative emotions toward you? I think not.
In a sense, “looking down on people” can itself be seen as an irrational failing, so I would look down on people who are as rational as me in everything else but not in that one thing (except, recursively, I wouldn’t, because if I looked down on them, I would be no better than the people I was looking down on).
(An interesting fact: I did, in fact, look down on people for not getting things I get, before I found the rationalist community. This is one of those irrational habits I actively unlearned, though the exact path to unlearning it was kind of complicated.)
This can be generalized to something like “I want to have the same decision algorithm for evaluating other people that they should have for evaluating me, one that is robust to differences in specific values.” (But this gets into really weird anthropic and/or Hinduist questions like “could I have been born with a different personality and still be me; could another person have been born as me, and I as them?”)
3. “I thought about my own positionality and luck.” This is central, and very salient to me. I was not always rational. I didn’t always care about truth (I only started caring about truth after I became obsessed with being a moral person, which itself only happened long after I stopped wanting to live. Long story). It’s better to be rational than not, but I can’t expect that of people by default. Rationality is a skill that people can learn. I am trying to teach people what I can, because I assume that without it they just wouldn’t know better. Raising the Sanity Waterline. “Like being offered a hand up, so that you can help the person behind you who’s still struggling in turn” is a recent fictional quote I love in this context (though it referred to a different thing originally).
Because people live in epistemological hell, and they have no way to get out. Or rather, they have 40 different sources telling them at all times how to get out of the hell, most of which are wrong because their conceptualization of the hell is wrong (but maybe mine is wrong too, and I am wrong about some of those being wrong).
Some of them do not want to learn and do not want to know the truth, but again, I once didn’t want to know the truth, and those people maybe would have in different circumstances. This doesn’t make the situation better, but it’s not their fault or a sign of their moral failure (to the extent moral failure is even a thing). Some people hate the rationalist movement/community and are opposed to learning anything it teaches. That is bad, and wrong, but it’s a wrong position they arrived at through reasonable pathways (I am in the process of writing a Sequence about that, among other things). At the same time, just because someone is a rationalist, values truth, knows what cognitive biases are, separates observations from inferences, and plans for the least convenient world... that doesn’t mean they don’t believe something very important which is false and harmful, or that they don’t make unendorsed mistakes while failing to notice the beliefs behind them. See point 2.
Ultimately, it’s your right to like and dislike whomever you want, and to prefer any company you want. But I don’t think you need to sacrifice your sense of egalitarianism for your intellectual integrity, because there is no factual observation that equals “a lot of people don’t deserve good things because they are not intelligent/rational enough.”
(Unless you specifically mean support for particular policies and laws. Which, yes, ideal laws probably should account for people being irrational. Though by the point where laws are ideal, teaching people better thinking, or just letting them learn it in that better environment, would probably make those differences in laws superfluous. And in the current situation, most such changes would probably have bad results in expectation. You probably didn’t mean that specific thing anyway; I am only writing it for completeness.)
“when i am around people i find intellectually unserious, i deny them personhood and i act in an incredibly shitty way. my worldview says this is bad and i am sure you agree; my nervous system becomes suffused with hatred and does it anyway.”
The problem seems to be mostly emotional, and so it should be treated like any emotional problem of that sort. There is a fact, and the fact makes you feel a negative emotion.
What I do in such situations is look at the fact and the emotion, examine them closely, look at the connection between them, find the metaphorical very small arrow saying “this fact should make you feel this emotion,” and ask “but why? Why should the emotion be caused by the fact? What purpose does it serve? What becomes better by my feeling that emotion? Do I want to want to feel this?” until it hopefully goes away. Or sometimes it doesn’t, and then I keep living with that emotion. I don’t know what it’s like for other people.