I'm finally managing to finish my "basic" training in rationality, by which I mean finishing "Rationality: A-Z" (I had studied the first half years ago, but I foolishly stopped when I got to the part about reductionism, which was unbelievably stupid of me even with all the reasons that led me to do so). I plan to continue studying even more material once I'm done with it, to train myself in instrumental rationality and everything else I can find to make myself as smart as I possibly can be. I'm very satisfied with my progress; the first half of the Sequences helped me improve tremendously years ago, and now I can see myself improving again.
But, even while I am still at what I think is just the beginning of my improvement, I’m noticing more and more a rather serious problem.
To put it politely, I hate how people think, now.
I know it’s really unfair because I didn’t know any better mere weeks ago, and years ago I was a good textbook example of an intelligent person who’d keep mainly using his intelligence to rationalise whatever questionable decisions he made, but I just can’t help it.
I notice leaps of logic, cognitive missteps and dumb conclusions from people who are considered smart, deep and expert while they talk on the radio or in other media, and I get angry.
I notice idiotic ideas, as well as habits of thought that are the cognitive equivalent of shooting yourself in both knees, spreading inside ideologies I deeply care about, because the evils they fight are very real and demonstrated by science; but now I can see how all the truth is hopelessly getting mixed up with stuff that's just stupid or wrong, and that the intelligent people who once introduced me to these ideologies are absolutely incapable of judging and criticising any bad idea that comes from their own side, and I get livid.
Half the time I hear someone talking, I have to choose between politely tearing apart the majority of what they said, growing more and more annoyed, or just shutting off my attention and thinking about something else while pretending to listen.
And all this is just when I have to deal with intelligent people.
I can't comprehend how a stupid person thinks unless I just stop thinking of him as an actual human being, switch off my empathy completely and model him as a badly designed computer program with a bunch of floating beliefs in his memory and no analytical or critical skill whatsoever. If I try doing it the intuitive way, using empathy and telling my brain to think like him, my brain just keeps running out of suspension of disbelief: I can't avoid thinking that, no matter how much I believed that political party/religion/philosophy X was right, I'd still recognise the blatantly idiotic parts of it as very, very stupid ideas the first time I heard them, since even before rationality I was never actually stupid enough to believe something that was plainly dumb even at surface level. So I can't even understand why he's doing what he's doing, let alone predict it.
And all this is really starting to weigh on me. I think my mood has changed for the worse in the last few weeks.
If you have read HPMOR, I think I'm starting to feel like Professor Quirrell, and my brain has started to actually think the words "avada kedavra" when I hear something particularly stupid and hateful. I wouldn't do it even if I could get away with it, but, emotion-wise, I have to consciously remind myself of reasons why killing someone that stupid wouldn't be a net positive gain for mankind and wouldn't just spare us a waste of oxygen. The me of several years ago would have just smirked and nodded at these kinds of thoughts, but I want to be smarter than the old me, and smarter than Professor Quirrell as well.
I'm sorry if that was longer and more emotional than strictly necessary; I wanted to communicate exactly how I feel, and I really needed to say these things to someone. I'll try to go straight to the point now.
I think that rationality is completely worth it. I don't regret studying it at all, I don't want anyone to think that I regret studying it or to suggest not studying it, and I will continue to move forward and improve myself. But I also think that the smart thing to do is to look for ways to cheat and avoid paying this "price" as well.
So, what I want to know is:
1. Did other people who already learned rationality go through this as well?
2. If so, does it continue, or do you eventually just get used to other people being insane and stop minding it that much emotionally? I can't remember being this annoyed at people back when I had read only the first half of the Sequences.
3. Do you know of, or have you tried, any particular strategy for not being annoyed by, or feeling… disinterested in, other people? If so, did it work? Could you suggest any material that explains it in more detail?
4. What do you currently do when you have to deal with the kind of problem I have described? (If your answer to this is similar to 3, you can just skip this.)
5. Can you suggest any material or strategy for effectively modelling and predicting stupid people's behaviour?
And, on a side note:
6. Can you recommend any reading material or training you think made you smarter or better at predicting the world or other people? I have checked some of the posts about this on this site, but still thought it was worth asking. If you know of posts or lists about this, linking those would also be a huge help.
Thanks to everyone who chooses to answer this; I'll really appreciate any help and information I can get.
Edit 04/11/2020: I stress-tested some of the advice I could apply right away, by watching a 45-minute video of interviews made at a Covid-19 deniers' mass protest.
I got angry about twice, and got a really odd look from the person who was with me because I said out loud something about the most annoying kitten I'd ever seen, but I have to say my mood was a lot better than when I usually tried to just not get angry at people.
What seemed to work the most was:
Thinking about people with very bad epistemology and beliefs as victims of an infectious process of bad epistemology.
Trying to understand why they believed what they believed and why they thought the way they thought. I finally managed to form predictions and make models with moving, detailed parts. Every time I noticed I was confused about why someone believed something, I just kept trying until I had a model I could really understand, not just "non-sentient entities that resemble real people have been observed to exhibit stupidity number x". It's the first time in my life I've managed to reach that level of empathy with that type of mental process, to understand why they didn't feel their world-view was weird, rather than just reminding myself that people believe weird things.
This question has been really useful to me already; I expect its usefulness will shoot up a lot further as I read the materials people suggested to me.
I really wish to thank everyone for the excellent advice, and please do feel free to still post advice on 6. if you wish to!
Just as intelligence is orthogonal to morality, the intrinsic value of a human being is orthogonal to that human being's intelligence. I don't judge other people for being stupid any more than I would judge a dog for being stupid. We are all just animals. I love dogs and people for being exactly what we are.
I went through a cynicism phase similar to what you seem to be going through. I realize, looking back, that my disdain was connected to having low status myself. These days, now that I have high status, I think of dumb people more like kittens and less like bad guys.
If you think you are smarter than other people then either you are wrong or you are right. If you are wrong then you should change your mind. If you are right then you live in an extremely inefficient world and can make a killing. The antidote to stupid words is intelligent action. If you’re not winning then you’re doing rationality wrong.
In the land of the blind, the one-eyed person is dictator. It’s good to be the dictator. If you’re not dictator then either you are blind or you do not live in the land of the blind.
Abstain from stupid media like news, Facebook and videogames.
Learn to use Anki spaced repetition software.
Teach yourself to read and write Chinese. (This is my favorite antidote for thinking you’re smarter than other people.) Then read The Art of War in its original language.
Complete a college degree in physics.
Complete a college degree in mathematics.
Learn economics, especially microeconomics.
Read all of Paul Graham’s articles.
Teach yourself computer science and machine learning.
Start a tech company.
Start a non-tech enterprise.
Get in shape by lifting weights.
Learn history. Make sure you cover at least three major civilizations (China, the Islamic World and Europe is a good place to start). This helps with perspective.
Read ethnographies on pastoralism and hunter-gatherers. Two excellent books are Arabian Sands by Wilfred Thesiger and Nisa by Marjorie Shostak. This helps you understand what people were designed for.
Learn the basics of evolutionary biology.
Acquaint yourself with the research on IQ and the Big 5 personality traits.
Take a long-distance trip with $100 in your pocket, earning the money you need to survive en route.
Teach classes.
This is… an impressive list. I really mean it.
Some items are pretty much exactly what I need for my goals, and if I had a lot of time I could try a lot more.
Sadly, I need to get as smart as I can really fast. I do now know a lot of things that are going onto my "first century of life" list, though.
It’s funny, I got to a similar moral conclusion about an hour before reading it in your answer.
This is an extremely useful way to think about it.
I have had an insistent feeling about this for a while, but I just had vague ideas I couldn’t focus on or test. This seems an extremely good point from which to start thinking about it.
I guess it’s not really relevant, but this is the first time someone manages to describe my exact feelings about this. Thank you.
(Not that I want to literally be a dictator, I’m stating it out loud just so I don’t risk being misunderstood by someone else who hasn’t had my exact thoughts)
I had tried to idly learn Japanese as a pastime; it took me around five days to realise it was just wasted time if I couldn't dedicate some serious effort to it. I think I was told by friends that Chinese is substantially harder. Could you give me an estimate of how much The Art of War loses when read in a good translation?
When translated into English, The Art of War loses almost as much as Romeo and Juliet loses when translated into Japanese.
If you can't read it in Chinese, then this is the best translation I know of.
Which gives the person who is asking nothing. "Just do what is fun for you" would be better advice.
A wise friend once said to me something like this:
"You could look at all the stuff that's happening in the world, and all the things people are saying and doing, and be like 'They're all monkeys! Monkeys in suits! AAaaaagh!' However, you could also look and say: 'Wow, look at what the monkeys built! It's so cool that they got even this far!'"
When you think about it, because of the way evolution works, humans are probably hovering right around the bare-minimal level of rationality and intelligence needed to build and sustain civilization. Otherwise, civilization would have happened earlier, to our hominid ancestors. We’re just monkeys that have learned some cool tricks.
The next thing to remember, of course, is that you’re a monkey too. You may be teaching yourself some cool rationality stuff, but you are still a monkey, and if you aren’t careful you’ll get arrogant/overconfident or some other such problem.
I sympathize with this bit especially. My reaction tends to be more cosmic horror than anger/frustration though. I tried to express it here.
“When you think about it, because of the way evolution works, humans are probably hovering right around the bare-minimal level of rationality and intelligence needed to build and sustain civilization. Otherwise, civilization would have happened earlier,”
I actually profoundly disagree with this both empirically and theoretically.
Civilizations are not some kind of natural inevitable ‘next step’ that must happen when you have a smart animal. They are a thing that CAN happen in the context of a smart animal that is capable of inventing agriculture. But there are other prerequisites.
I find the argument that complex culture is a thing that can happen in dense enough human populations, running away as it further densifies the population, persuasive. The idea is that in a low density human population ideas sometimes fail to percolate down the generations, while in a dense enough social network innovations stick down the generations more frequently because losses are less likely. It is possible that you can reach a ‘tipping point’ in a dense enough population at which point the ability to pass on new innovations allows a denser population still and further accumulation of complex culture.
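To make this concrete, here is a minimal toy sketch in Python of the percolation idea; the retention formula and every number in it are my own illustrative assumptions, not figures from the comment or from the literature.

```python
# Toy illustration (assumed numbers, not real data) of the percolation idea:
# a skill survives a generation only if at least one person in the network
# successfully learns it, so small populations slowly lose skills while
# large, dense ones retain (and can keep accumulating) them.

def expected_skills_retained(population, generations=400,
                             initial_skills=20, p_learn=0.01):
    """p_learn: chance that any single person picks up a given skill."""
    p_skill_survives_one_generation = 1 - (1 - p_learn) ** population
    return initial_skills * p_skill_survives_one_generation ** generations

for pop in (100, 500, 2000, 10000):
    print(pop, round(expected_skills_retained(pop), 2))
# Small networks lose most of their skills over this horizon, while past a
# certain size essentially nothing is lost: a tipping point.
```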
There is a bit of a case study in Tasmania. The native Tasmanian population had continuity with the aboriginal Australian population before the end of the ice age, when the two landmasses were united. Ten thousand years later, upon European contact, the Aboriginal Australians maintained oral culture of events and places tens of thousands of years back, and had kept and expanded upon the toolset that existed in the united landmass… while the Tasmanians, with a smaller social network and a less dense population on that land, had lost large numbers of tools and skills including the ability to produce fire de novo (while still being able to propagate it).
I think Neanderthals are also likely evidence pointing in this direction. Their brains were more or less the same size as ours, and they had a common ancestor a full 500,000 years ago with us. But they lived in the frozen wastes of ice age Europe, in small isolated subpopulations if the homozygosity of the neanderthal paleogenomes is to be believed, with LOTS of small subpopulation bottlenecks. That’s a perfect recipe for repeatedly losing your complex material culture down the generations.
Empirically, human brain size has also been on a downtrend for the past fifteen thousand years as agriculture and civilization have spread. It is a simpler environment, with fewer complex things you need to interact with on their own terms and significantly worse nutrition, so we give up some small fraction of our highly expensive intelligence over long periods of time.
Good points. I think I agree with everything you said, so I’m confused as to why we disagree. I guess your model is that we got intelligence + rationality first, and then civilization came later when we had population density, and therefore we might have more intelligence + rationality than we need to sustain civilization. The fact that brain size has been shrinking supports this; maybe we were more rational 15,000 years ago, or at least more intelligent.
I think my claim is still true though—it does seem like civilization would collapse if we got significantly dumber or less rational. I guess I had been meaning “hovering around bare minimum level” more loosely than you.
I think I concede that my argument was shaky and that we probably aren’t at the bare minimum level for reasons you mention. But I still think we are close, for a loose definition of close.
I think this will be a really helpful thought to keep in mind, thank you.
Also helpful; I think I was starting to think of myself as being done with all the basic biases.
I guess I could try to see it that way, at least I wouldn’t be angry at people who actually helped me improve back in the past.
I sometimes thought about what happened to them in terms of "That stupid way of thinking got them; it will mess them up more and more if they don't get rid of it".
It seems better to try seeing them as infected by a bad, harmful meme than to just get angry at them because they’re suddenly being stupid.
When a person changes their way of thinking radically, it is normal for them to want to tell everybody about it. This happens even if the change is what people here might consider irrational (think becoming religious). There's even a Wiktionary phrase for it, "passion of a convert".
So, the first thing I would say to your anger phase is, “Don’t worry, you’ll get over it.”
If you want to speed up getting over it, it might be useful to practice two things. The first is to really focus on personal improvement and realize you’re still a newb. The second is to deeply empathize with why other people do and believe the things they do, and realize that you were that way even a few weeks, months, years ago.
A sophomore in engineering can't feel angry that an undecided freshman doesn't know calculus. A senior in aerospace engineering can't feel angry that a senior in mechanical engineering doesn't know anything about wing design. Who are you to get angry that a person hasn't memorized yourbias.is when you can't even differentiate the Many Worlds interpretation from the Copenhagen interpretation?
Everybody is still building out their map, and just because you’ve luckily found yourself on a part of elevated territory and you’re able to make a better map, doesn’t mean those with a lesser view are worse.
Secondly, it would help to read about how people come to their world views, and also specifically read about how people came to the rat-community. Basically, read people's personal "testimonies" and you'll find that a lot of it is driven by a mixture of personal and cultural factors. Also read testimonies of people who converted into different religions, or even the testimonies of people who didn't convert at all.
For example, I have a Jehovah's Witness friend. She got very close to deconversion 10 years ago, to the point of listing out reasons that the JWs were wrong. Yet, last I saw on Instagram, she was going to the JW headquarters and performing missionary work. Her family, her extended family, and most of her friends were all religious. Can I really be angry that her brain said, "Am I going to believe what gives me massive amounts of comfort, or am I going to believe something that could literally cause my death?"
As far as books, I would encourage reading Jonathan Haidt’s The Righteous Mind. The book attempts to look at the evolutionary background for humans’ moral systems, and is very good at injecting a large dose of empathy into its readers.
That’s a relief.
Yeah, I usually try to think like that; what I felt lately was more like… finding out that your calculus professor doesn't actually know how to do calculus in one case, and in the other that the freshmen in a science faculty can't actually manage to understand simple Aristotelian logic…
Usually I get most of my annoyance from listening to supposed experts who are making evident mistakes, or from listening to people who are particularly stupid.
A really… sobering way to look at it, thank you.
I had been trying to be as smart as I could for years even before finding rationality, but finding something that good, which jumpstarted my accuracy and intelligence a lot, was sheer luck.
Also, I didn't really do anything to be born with above-average intelligence, and I didn't do anything to be raised in a home where education was highly valued, so I guess that even trying to be smarter isn't such an obvious idea to have.
I guess we could call it the self-made man fallacy: if you saw hard work pay off for you, you feel like everyone else ought to just try and it would work for them as well, but you don't notice the strokes of luck you had, or that you still had an advantage to start with.
And I knew all this stuff already, but… I don't know, I guess I still felt as if certain things were so obvious that anyone not figuring them out wouldn't have any excuse, since those are things I've always known, so I've allowed my emotional response to be shaped by how this feels from the inside.
Your friend isn't the kind of person I'd have got mad at, at least if I knew what you know about the things that trapped her into staying… which, I just realised, is the correspondence bias word for word.
If I can't see why people are missing the obvious truth (though I don't consider dropping religion as obvious, I know it can be pretty hard), I might just not know enough about how they learned to think, or what harmful meme got into their cognition before I met them, or what they think would happen if they didn't believe what they believe… Even pure cognitive laziness has to be caused by something. I shouldn't just have written "dumb" on my model of their cognitive processes, as if dumbness were a simple mystical essence with no moving parts; I should have known better.
I've read about this exact mistake so many times that it's not even funny. I had to force myself to spell it out here, even though I know it's really a good thing that I've noticed it and that I'm admitting it, because getting the basics wrong feels just so embarrassing.
Asking this question was extremely useful to me, it seems. I'll check out the book; it seems pretty much what I was looking for.
Reading Less Wrong made me unable to enjoy debating politics. Now the average online debate seems like a competition over who is most stupid. When Facebook shows me a news article with more than 100 comments and I read a few of them, I feel dirty.
My recommended first help would be: think less about stupidity of other people, and more about your own. (Applying my lesson on myself: why am I clicking the “comments” link when I see there are more than 100 comments? And why am I even browsing Facebook in the first place?) If you are so rational, why aren’t you winning more? Yeah, some things in life depend on cooperation of others. But some other things don’t—have you already maximized those? Why not? Did you already clean up your room?
And my point here is not that if you focus on improving yourself, miracles are going to happen just because you read the Sequences. It’s just that focusing on improving yourself has a chance to lead to something useful, unlike complaining about the stupidity of others.
Most people simply don't care about their sanity. It is a fact about your environment; deal with it. To a certain degree, this is about "near" vs "far" thinking (Robin Hanson writes a lot about it); people usually behave quite reasonably in their everyday lives, and say utterly crazy bullshit about anything abstract or remote. They survive because they do not try to connect these two parts; it is as if they live in two completely different universes at the same time.
When you think about incentives, here is the reason: in the "near" mode you are rewarded or punished by the natural consequences of your actions; in the "far" mode you are rewarded or punished by the social consequences of your statements. Thus it makes sense to act reasonably in your everyday life, and spout exactly the type of crazy bullshit that gets rewarded in a given social situation. On average. Sometimes following the socially approved action (using homeopathy for an actual illness, or not wearing a face mask in the COVID-19 situation) gets you killed. But historically, way more people got killed because they pissed off their neighbors by openly disagreeing with them about something; and it didn't matter who was actually right.
I kinda see people on a scale, roughly separated into three groups. On one extreme, wannabe rationalists. Those are my tribe. On the other extreme, the actively irrational; the kind that not only believes something crazy, but won't shut up about it. Those I consider hopeless. But between them, and I think it might be the majority of the population, are people who kinda try to do their best, sometimes impressively, sometimes their best is not very good; who have some bullshit in their heads because their environment put it there, but are not actively promoting it, merely unable to clean it up; and who are able to see and listen. With those, I need to find the safe set of conversation topics, and remain there most of the time, sometimes gently probing the boundaries. There is this "agree to disagree" bullshit, which would be intellectually lazy and kinda offensive toward your fellow rationalists, but it is a great peace-keeping tool between different tribes.
I never try to convert people. I explain, sometimes I nudge. If there is no reaction, I stop.
I am bad at predicting stupid people. I mean, I can vaguely predict that they will most likely “do something stupid”, but it is hard to make specific predictions. People are usually driven by emotions: they defend what they like, and attack what they dislike. They like things that make them feel good, and dislike things that make them feel bad (e.g. being told they are wrong about something). But in real-life situations, multiple forces act upon them at the same time, and I can’t predict which effect will prevail.
This is generally good advice, and I do need to be more mindful of my own stupidity, but my problem isn't that I go searching for other people's stupidity so I can get angry at them; it's more that… I'm getting more and more annoyed every time I accidentally bump into it, and I'm trying to avoid reacting by shutting off everything and everyone. Though some of the advice I'm receiving looks helpful for not doing that.
I guess that could explain the lack of critical sense they show about stuff they aren't expert on. I've never cared about simply agreeing with other people's ideas if they didn't seem right to me at first sight, and I usually thought I was the one who knew best (even when I was deeply wrong about it), so that's not a factor my brain considers when trying to simulate other people. Thank you for this useful insight.
I hadn't thought of it that way. I was refusing to "agree to disagree" as if it were a moral rule, but I should stick with that if I see no chance I can actually persuade someone. To be more precise, I had figured out that with non-rationalists it was often better to agree to disagree since it would be a lost cause, but I thought I just couldn't do that, no matter who I was talking to.
I’m still a bit queasy about apparently supporting bad epistemology, so I think I’ll try to state it like “We can’t both be right, but I guess talking about it won’t lead us anywhere, so let’s just forget about it”.
Yep. Let’s not fight about it.
I would say that even among rationalists, it may be sometimes useful to settle for: “logically, at least one of us must be wrong… but finding out which one would probably be too costly, and this topic is not that important”.
Ironically I understood the “too costly” logic between rationalists pretty fast, since I’ve witnessed arguments being dissolved or hitting an objectively hard barrier to overcome really fast.
When I'm dealing with non-rationalists, instead, I kinda have the impression agreement is just around the corner.
"I understood your point of view and I'd have changed mine if I was making a mistake. If we are still talking, it means I figured out what mistake you are making; why can't you just understand what I'm saying or tell me the part you aren't understanding? I'm doing my best to explain and I've been honest with you..."
That's the sensation I usually feel when I care enough to argue about something and don't just write the effort off as hopeless from the start; but it's just that, what I feel. It's clearly not easy at all for someone to do all of a sudden what I specifically trained myself to do.
This is very good.
If you are listening to an expert on the radio or similar mainstream media, the expert gives you a dumbed-down argument for the position he's holding.
In an interview you have cycles of the expert making a complex claim and then the interviewer telling them: "Can you say this in a more concise way?"
If the expert doesn't really get it, they might also be told: "Part of our audience is housewives who never went to college and who listen to our program while doing the dishes; can you make your point in a way that the housewife also understands while she does the dishes?" (This example is recounted from memory from https://media.ccc.de/v/24c3-2334-de-die_wahrheit_und_was_wirklich_passierte/related )
The argument that the same expert would make when sitting down with colleagues, where the expert can have an off-the-record conversation, will be more nuanced and complex than the argument the expert gives on the radio.
If you hear an obviously flawed argument on the radio, you shouldn't jump to the conclusion that the expert making it is stupid, but rather that they are just not in a position to give you the nuanced argument.
“No.”
(When talking to non-experts, most points should become less concise than when talking to other experts, because to meaningfully communicate anything to a non-expert, you also have to communicate the necessary prerequisites that other experts already know.)
It's a valid stance to take, but it's the stance that gets the journalist to ask some other expert who's willing to be concise. The people you hear interviewed are generally willing to play the journalists' game.
As a news consumer, it's useful not to have misconceptions about what kind of information you are exposed to.
Exactly, that’s what makes the question as you formulated it funny. It’s not a question, or even a request. It’s a non-negotiable demand. If you don’t concede, the whole deal is off. Yet not conceding is often the only reasonable thing to do, so it’s a demand to be unreasonable masquerading as a question, because don’t be rude.
I hadn’t thought about this possibility.
I remember having noticed people badly explaining things I knew were actually right and had better proofs than what was being presented. If I hadn't known about the evidence already, I wouldn't have noticed they were misrepresenting the position; but the subjects I get angry about are rarely the kind of thing where the background knowledge is so complex you can't explain it properly to a layman.
Some of what I got angry about were just plainly stupid ideas; it doesn't look plausible that the people talking had better reasons to support them and just weren't saying them. It clearly looked as if someone was trying to be clever rather than trying to think about the evidence, and those people were the experts of their side, not the village fools.
But I did get angry at people who were quoting research and studies I hadn't read, because, from the way they explained said studies, it was clear to me that the research was plain rubbish conceived by someone who just doesn't understand what research and evidence are.
But it’s indeed possible that the people quoting it had just made a mess and understood nothing…
I always tried to avoid being misled just because I didn't understand passages of the two-minute version of an idea; if I wasn't understanding why they had said a thing, I'd go read more on it.
I never thought that people could go as far as to completely botch the two-minute version they had explained to me, even when they are supposed to have studied the subject, but it's clearly possible, even if not so likely.
I’ll have to remember to check the original sources when I really should get something right.
I remember a TV interview I did with a friend on Quantified Self. One of the elements was my friend measuring stress with an emWave2. In the process of dumbing down the complexity of what we were doing to make it TV-compatible, my friend in the end said that he was measuring heart rate with the emWave2 to measure stress.
The thing the emWave2 actually measures is heart rate variability, but there was no time to explain what heart rate variability is. A viewer who actually understood the subject matter would rightfully find it strange that my friend said he measures heart rate for stress, but for the average viewer that inaccuracy wouldn't be a big deal.
Complexity reduction like that happens when you focus on expressing yourself in a way that works on TV and the radio.
I see, thank you for this example.
I'll remember to prepare the dumbed-down explanations in advance; according to my plans, I'll have to communicate a lot in the future.
I experienced something similar with spelling mistakes for a while. The solution was to explicitly conceptualize text-on-the-page as separate from idealized-text, so that the mistakes could be imagined to be blissfully absent in the idealized text.
The issue is that when you notice a bug, there is an urge to fix it that demands satisfaction. Sometimes, there is an actual plan that fixes the bug, but intuition won’t come up with it, so deliberative thought needs to help. When fixing the bug is not on the table, it might suffice to just carefully formulate what’s known about it, perhaps writing up some notes.
For people, productive activities include charity and steelmanning: figuring out why a behavior actually happens and how to channel its purpose better.
Thanks for the link; the chewing example does feel similar to my experience, I will try to think about that.
It is kind of a meme that people learn about rationality and then observe how irrational everyone else is. It is a lot easier to observe others' irrationality than your own. But probably one's own irrationality is more important.
1. So, work on your own irrationality first before focusing on others’ limitations.
2. As for dealing with other people’s irrationality, see (1).
3. Finally, people are going to do what they want to do. With some very rare people you can introduce them to rationality things and they might change. With most, they cannot or don’t want to be rational. This is the reality that you need to rationally deal with.
4. Also be aware that full rationality is not possible, in the sense that you cannot do all the calculations needed to behave totally rationally. You need to employ all sorts of heuristics and shortcuts. My computational capacity is limited, and so is my memory. Gathering data is costly. Time is short. So tolerate other people who deal with this in ways you might not prefer.
I’ve just felt how much this is true by thinking about some of the answers I got.
There really is a huge difference between just "knowing" something (I'd have known this even before being told in these replies) and actually realising that I was making stupid mistakes in how I thought about this very subject.
I would have agreed with points 1 and 2 right away, and I wrote this question with 3 firmly in mind, so I thought I was being really rational about this whole issue, since I actually knew I was supposed to search for a way to deal with how I felt about it rather than magically expect people to change overnight; and I had still overlooked a mistake I was making in how I thought about "dumb people", a mistake that was causing most of my negative feelings.
About 4: heuristics, shortcuts and not wanting to think about something were all things I understood and tolerated in other people.
I felt angry when
1) facing the sheer, total lack of judgement that some people show in areas where I felt they should at least try to have some, and
2) facing the more questionable approaches to finding truth that supposedly smart experts use while talking about stuff they are thought to know. The kind of stuff you find in theories of magic and psychoanalysis, only apparently it has been creeping into all types of modern humanistic fields. Before, I could just vaguely recognise that something was wrong with how they reached a conclusion; as soon as I finished reading about reductionism and about all useful parts of cognition having to be Bayesian at some level, I could suddenly give it a name, understand exactly what they were doing wrong and put it into words. So I suddenly got a lot more annoyed at them, as if it were the mistake that had just gotten dumber rather than me getting smarter.
I like how much your answer bears resemblance to advice on other subjects unrelated to rationality.
Related: The treacherous path to rationality
Maybe your question is addressed by this part:
I was really puzzled reading that post; to me, learning rationality always felt wonderful. My first round with it was like I had suddenly noticed I was living in a really small cage inside my head, and now I could suddenly open the door, get out and walk outside on my own legs for the first time, and then run. Now that I'm finally managing to continue, I feel like the rest of the world just gets clearer and clearer to understand, even if I got these negative emotions as side effects.
I can only assume I was the ideal subject to learn it; when I stumbled into it I was managing to sabotage myself at everything relevant I tried to do, in an obstinate attempt not to risk gaining any possible disconfirmation of my intelligence.
I had written more about it, but then I realised I should just write a coming-of-age post or a postmortem about this.
Back to the subject:
I guess this kinda describes what happened to me, it wasn’t exactly a perilous path but I did put in a lot of work.
I'm really unsure about how I could try to integrate my intuitions into my explicit reasoning; at first sight they seem like incompatible processes, since you can't really understand why you are having a particular intuition (if the post uses "intuitions" to mean the kind of ideas or judgements you can't explain at an explicit level).
Or is the suggestion to apply explicit reason to check whether the initial suggestions my intuitions give me make sense?
So far I haven't managed to intentionally use intuition to solve a single relevant problem; I think my mind mostly uses intuition when I don't have the time to make all the calls by explicit reason, or for pre-selecting good ideas and pointing out possible mistakes that I then examine.
All in all, I don't think I trust my intuitions much, because explicit reason improved my performance a lot and I feel very nervous about going with something I can't make sense of.
If anyone has thoughts on this or suggestions I’d love to hear them. The other mental faculties mentioned seem easier to integrate with explicit reason.
I know; I wrote this question also because I didn't want to risk feeling angry or disinterested toward my friends. Even if I know they are relatively "crazy", I don't feel at all like I shouldn't be friends with them anymore…
I guess it would be a good idea to remind myself to notice and appreciate what I like about them and the warm things they do, even if they aren’t at all related to being smart or rational.
Intuition is distilled deliberation. Deliberation is a sequence of intuitive steps, amplified intuition. A given intuition is formed by (and stands for) the dataset that trains it, the habits of deliberative thought on its specific topic.
I didn't intend to imply that learning rationality has to feel difficult or hard. It sure didn't for me, as my path started early and I had a lot of support. But I guess it can be challenging in some circumstances.
I understand; what I meant was that I initially felt confused reading the post you linked, since that one did imply that a lot of people do.
But having thought about it, it seems likely that a lot of people would find themselves in those challenging circumstances.
-
I’m all for having an accurate map, and that does mean updating that map. But don’t let that stop you from trying to alter the territory—and actually fixing problems.
If the world fails to meet your expectations, sometimes the problem is with the world.
-
I strongly disagree (about “by definition”; it’s of course a popular sense of the word). Operationalization of caring is value, preference. It’s channeled by decision making, and deliberative thought is capable of taking over decision making. As such, it may pursue an arbitrary purpose that a person can imagine. A purpose not derived from emotion in any way might be thought to be an incorrect idealization of preference, but even a preference ultimately grounded in emotion will be expressed by decisions that emotions are occasionally incapable of keeping up with.
-
Yes, that's exactly how I was looking at it, though I guess I didn't do a very good job of explaining that in my question.
I mean, I still think the current lack of rationality in the world is a big problem, but it’s not like I expect people to do better any time soon, I was just looking for ways to avoid feeling like I feel when I’m reminded of that.
I’ll look into DBT and try your advice, thanks.
-
Maybe it would help if you realized that most people most of the time are not interested in being explicitly rational. They’re focused on something else: often they’re focused on building relationships, or getting a task done, or enjoying themselves. Maybe you could try focusing on those things too, especially the relationship-building bit, instead of choosing between “tearing apart” or ignoring what they say.
Also, I don’t know how old you are, but I’ve noticed that the people I interact with have gotten more congenial over time. As a child/teen/college student, many of my interactions were with nonchosen family or classmates. Now most of my interactions are with chosen family, friends, or workmates filtered to be more like me.
Oh, and since you mention being annoyed by “experts” on the radio, maybe...don’t listen to the radio or other media. You probably don’t need to do that, you’re not getting any relationship-building benefits out of it, and it’s annoying you.
I'm afraid that's the main reason I'm getting angry at them: the utter lack of trying to be intelligent when they have to choose what to do or believe.
I never get angry at people for enjoying something stupid, or feel like they should treat each other as robots, or because they just follow (non-evil) instructions; that I can understand.
I get angry only when it involves something where they really, really should try to get it right, and they don’t even have excuses like being under stress or pressure, and they still don’t try even just a little.
Though, now that I spell it out in more detail, I realise (well, more like remember, I already knew it) that people can focus on those other aspects you mention even when they shouldn't.
This was helpful, I hadn’t noticed that I needed a more complex model of “being dumb” both to model people and to not get angry at them.
Unfortunately I had already selected what I'm exposed to, in terms of both relationships and media, as much as I could a while ago; it wouldn't be easy to make a second selection.
One aspect of intelligence/rationality is estimating the productiveness of a conversation before it happens. Another is expressing your views in a way that sounds palatable even to those who disagree. Another is recognizing that on any given topic there are more knowledgeable people than you, and seeking them out. Another is directing most of your effort and emotion toward things you can influence. You can’t learn these things from a book though, you have to practice them.
I think I’m doing more or less okay in most of these (still room for a lot of improvement of course), my problem seems to be focused around:
1)
I can only do this if I understand how someone thinks, and I have to get a better model of how the people I usually wouldn't really want to talk to think. (For my goals, I need to be able to influence those kinds of people as well.)
2)
I’m pretty good at avoiding wasted efforts, not good at all at directing my emotions toward things that I can actually influence.
I'd say books make a stronger starting point for practice, but it hadn't occurred to me that I could just try and choose not to get angry. I've managed to regulate my emotions this way before, though I never tried it on quick emotions...
I’ve gotten advice that should make not feeling angry easier, but will also give practice a shot.
I should really make a habit of focusing my emotions and attention on stuff that actually matters; not doing so has screwed me up in a lot of ways already. Thanks.
“I know it’s really unfair because I didn’t know any better mere weeks ago, and years ago I was a good textbook example of an intelligent person who’d keep mainly using his intelligence to rationalize whatever questionable decisions he made, but I just can’t help it.”
If we approach this through an economist's lens, the situation seems to change slightly. To an economist, a rational actor is someone or something that acts in her own self-interest. Acting in one's own self-interest means making decisions where the foreseen benefits outweigh the foreseen costs. This means that even someone who is addicted to a hard drug and continues to use that drug is acting rationally, as long as the benefits of continuing to use said hard drug outweigh the costs for that particular person.
Societally the values may not align, but that doesn't mean that they are irrational. It just means that our foreseen benefits and costs are different from theirs. If you want to go down that rabbit hole, look into behavioral economics. They like to claim that people can act irrationally.
I understand what you mean, but, under this lens, I'd be using "irrational" to describe thought processes that negatively affect attempts to estimate the foreseen benefits and costs of a decision, or that cause people to connect their foreseen benefits to actions that have no real reason to lead to them.
Also, the way the word is used on this site, “rationality” is also the art of managing to not have your short term benefits get in the way of the real long term benefits you’d rather choose, and in choosing which foreseen benefits and costs should matter most to you.
So the addict would be irrational if in his decision he considered only "pleasure from drug: +10 utility" and "temporary pain from stopping: −50 utility", rather than also "likelihood of slow disintegration of my life: 20% chance of −1000 utility" and "slowly decreasing effects from the drug and likelihood of increasing future difficulties in obtaining it: −80% of the future 'pleasure from drug: +10 utility'", because a) he's not considering the third factor at all since it's not certain, or b) he's considering only the first two because they are temporally closer, or c) he thinks that what he has seen happen to every drug addict he knows who is further along the temporal curve won't hold true for him, without having a good reason to think so. (This is not a good model of drug addiction, I think; I was just trying to describe what I mean.)
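For what it's worth, here is a minimal sketch in Python of the comparison I have in mind, using only the made-up utility numbers from the paragraph above; the structure and names are purely illustrative assumptions, not a real model of addiction.

```python
# Toy comparison of a myopic vs. fuller expected-utility evaluation of the
# addict's choice. All numbers are the illustrative utilities from the
# paragraph above; nothing here is meant as a real model of addiction.

PLEASURE_FROM_DRUG = 10      # immediate benefit of continuing
PAIN_FROM_STOPPING = -50     # immediate cost of stopping

def myopic_evaluation():
    """Considers only the two immediate, temporally close terms."""
    value_continue = PLEASURE_FROM_DRUG
    value_stop = PAIN_FROM_STOPPING
    return value_continue, value_stop

def fuller_evaluation():
    """Also counts the uncertain and more distant terms the myopic view drops."""
    p_life_disintegration = 0.20
    cost_life_disintegration = -1000
    lost_future_pleasure = -0.80 * PLEASURE_FROM_DRUG  # tolerance / access problems
    value_continue = (PLEASURE_FROM_DRUG
                      + p_life_disintegration * cost_life_disintegration
                      + lost_future_pleasure)
    value_stop = PAIN_FROM_STOPPING
    return value_continue, value_stop

print(myopic_evaluation())  # (10, -50): continuing looks better
print(fuller_evaluation())  # (-198.0, -50): stopping looks better
```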
Still, this is not the kind of irrationality I was getting mad at; I think I was getting mad mostly at irrational decisions and thoughts that didn't have evident "strong causes". Addiction and social pressure are strong causes.
Going down that rabbit hole was in my plans, I’ll check out behavioural economics.
People are not only not rational, most do not WANT to be rational, and value many other things more highly than rationality. Remember, the story arcs of characters like Spock, Data, and Sheldon do not celebrate their becoming more rational.
That’s an interesting thought, I was aware that most fiction kept saying to people that Kirk beats Spock, I hadn’t noticed that even the character arcs of rational or smart characters are almost never about them getting smarter or improving their minds...
I think I saw that in a very few manga; even those that have a genius main character don't show him getting more rational or smarter at all. He's just a genius from start to finish, and any progress he makes is usually in other aspects of himself...
It seems this is really something that’s lacking from our culture.
You may be confused by some of my response. I’m well aware it deviates substantially from your inquiry—there is just substantial back-end stuff I think would help your autonomy to more efficiently improve in anything.
In Eliezer’s “12 Virtues of Rationality”, read the last virtue—the nameless virtue of the void. Take what follows as a guide to approach what he writes.
You appear to be approaching these problems with a vague mainframe, possibly even rationality as a whole with a vague superframe. When you ask for advice and sources to help, you think you want the subframes, which will fit onto your vague mainframe. While that will correlate with better decisions and will eventually lead to a clear mainframe, it will not nail them as efficiently or as expansively as could be accomplished if you were to deliberate it the other way around (recall the effects of skimming a book before reading, or defining the purpose before action, versus reading the book before skimming, or acting without purpose).
To devise a mainframe, though, you do need some knowledge, both about how best to make a schema and general knowledge about your area of improvement. Very quickly, you will find yourself scaffolding a formalization of the outer boundaries of what you and rationality currently know.
This principle can be applied to learning efficiency, rationality, or anything cognitive. This is how the mind works most naturally. This is what top thinkers are actually doing; it is how some people see the world clearer than others. This is how you prevent yourself from creating sub-optimal circumstances from within your own confusion and ignorance. This is not clearly widespread, and much less so brought to application. There are tools and decisions that arise from it.
If you do not have a clear and accurate model on which to assess yourself, you cannot expect to understand the beat of a situation, will not respond in the best way pragmatically possible, and your improvement will be drastically slower. You may be guessing about what exactly constitutes your insufficiency and thus not target your limiting attributes as well.
This is to aid you in constructing a proper mainframe for your specific inquiry:
When you feel emotional tension, there are two options: you can change yourself or you can change others. Pragmatically, you cannot often change others. It is the job of your short-term advocate to choose, and it is the job of your long-term advocate to build the prior knowledge required to assess whether it can (or should) be done.
With tension, there is some underlying value you are predisposed to assume. You can change this emotional tension from within the experience by changing your lens from which you are viewing it. Or, you can train the predisposition, which is to internalize general features of the desirable type of lens-changes.
Both are indispensable for a bounded rationalist. Training the predisposition means you can make better decisions across more instances, quicker, and with less cognitive effort. And being able to change your lens real-time is a good patch where your predisposition is insufficient. This autonomy can be defined as a controller of predispositions.
You do not want to eradicate emotional tension, you merely want to get rid of the unhelpful tension. Tension within can be extremely useful because it necessitates thought and behaviors to occur. We just want those thoughts and behaviors to be aligned to wider knowledge and purpose. My wider purpose through my bottle-necked knowledge, in short, is to minimize human suffering while maximizing sustainability.
Don't let these simple words fool you; there is great complexity in what they actually mean and how they may be applied. Abstract thinking applied seems to be the foundation for all decision-making; this is what rationality is in thought and action. Abstraction leaves out details, and thus inherently comes out more correct. Only after practice and targeted training can one refine his abstractions down to subsets of abstractions, and further still.
I recommend these two as the strongest sources that have brought me to the above propositions.
ICanStudy (“chunkmapping” is what they call the efficient frame-making. I cannot think of a more efficient and pragmatic way to organize a schema. Principles: Video 1, Video 2.)
and Jordan Peterson’s lecture series 2017 Personality and its Transformations.