There is an idea that I’ve sometimes heard around rationalist and EA circles, that goes something like “you shouldn’t ever feel safe, because nobody is actually ever safe”.
Wait, really?! If this is true then I had severely overestimated the sanity minimum of rationalists. The objections in your post are all true, of course, but they should also pop out in a sane person’s mind within like 15 seconds of actually hearing that statement...
It’s the kind of thought that one might have if they have a (possibly low-grade) anxiety issue: you feel anxious and like the world isn’t safe and you need to be alert all the time, so then your mind takes that observation as an axiom and generates intellectual reasoning to justify it. And I think there’s a subset of rationalists who were driven to rationality because they were anxious; Eliezer even has an old post suggesting that in order to be really dedicated to rationality, you need to have undergone trauma that broke your basic trust in people:
Of the people I know who are reaching upward as rationalists, who volunteer information about their childhoods, there is a surprising tendency to hear things like: “My family joined a cult and I had to break out,” or “One of my parents was clinically insane and I had to learn to filter out reality from their madness.”
My own experience with growing up in an Orthodox Jewish family seems tame by comparison… but it accomplished the same outcome: It broke my core emotional trust in the sanity of the people around me.
Until this core emotional trust is broken, you don’t start growing as a rationalist. I have trouble putting into words why this is so. Maybe any unusual skills you acquire—anything that makes you unusually rational—requires you to zig when other people zag. Maybe that’s just too scary, if the world still seems like a sane place unto you.
Or maybe you don’t bother putting in the hard work to be extra bonus sane, if normality doesn’t scare the hell out of you.
In retrospect, it’s not too surprising that people might develop anxiety and maladaptive thought patterns if “normality scares the hell out of” them.
Re “they should also pop out in a sane person’s mind within like 15 seconds of actually hearing that statement” I agree with that in the abstract; few people will say that a state of high physiological alertness/vigilance is Actually A Good Idea to cultivate for threats/risks not usefully countered by the effects of high physiological alertness.
Being able to reason about that in the abstract doesn’t necessarily transfer to actually stopping doing it. Personally, being told something along the lines of “you’re working yourself up into a counterproductive state of high physiological alertness about the risks of [risk], and counterproductively countering that with incredibly abstract thought disconnected from useful action” is not something I am very good at hearing from most people when I am in that sort of extraordinarily afraid state. It can really feel like someone wants to manipulate me into thinking that [risk] is not a big deal, or discourage me from doing anything about [risk], or that they’re seeking to make me more vulnerable to [risk]. These days this is rarely the case; but the heuristic still sticks around. Maybe I should find its commanding officer, so it can be told by someone it trusts that it’s okay to stand down...
To continue the military analogy: it’s as if you’d been asked to keep an eye out for a potential threat, and your commanding officer tells you over the radio to go to REDCON 1. Later on you hear an unfamiliar voice on the radio which doesn’t authenticate itself, and it keeps telling you that your heightened alertness is actually counterproductive and that you should stand down.
Would you stand down? No, you’d be incredibly suspicious! Interfering with the enemy’s communications is fair game in war. Are there situations where you would indeed obey the order from the unfamiliar voice? Perhaps! Maybe your commanding officer’s vehicle got destroyed, or more prosaically, maybe his radio died. But it would have to be a situation where you’re confident the voice represents legitimate military authority. That would be a high bar to clear, since if you do stand down and it was an enemy ruse, you’re in a very bad situation either way: captured by the enemy, or court-martialed for disobeying orders. If standing down seems to make zero tactical or strategic sense, your threshold would be even higher! In the extreme, nothing short of your commanding officer showing up in person would be enough.
All of this is totally consistent with the quoted section in OP that mentions “Goals and motivational weightings change”, “Information-gathering programs are redirected”, “Conceptual frames shift”, etc. The high physiological alertness program has to be a bit sticky, otherwise a predator stalking you could turn it off by sitting down and you’d be like “oh, I guess I’m not in danger anymore”. If you’ve been successfully tricked by a predator into thinking that it broke off the hunt when it really was finding a better position to attack you from, the program’s gonna be a bit stickier, since its job is to keep you from becoming food.
To get away from the analogies, I really appreciate this piece and how it was written. I specifically appreciate it because it doesn’t feel like it is an attempt to make me more vulnerable to something bad. Also I think it might have helped me get a bit of a felt sense shift.
Thank you for sharing that, I’m happy to hear it. :)
I want to mention here that the war example is a genuinely adversarial scenario, and applying an adversarial frame is usually not the correct choice in everyday life. Importantly, since the most perverse adversarial scenarios usually can’t be dealt with anyway (short of exotic physics) for computational complexity reasons, you usually shouldn’t focus on adversarial scenarios. On this point, Kaj Sotala is very, very correct in this post.
This logic can be taken too far (I don’t see the point of feeling constantly anxious), but at least on an intellectual level, I think it does make a certain amount of sense. It’s hard to notice the insanity or inadequacy of the world until it affects you personally. Some examples of this:
People buy insurance to be safe from <disaster>, but insurance companies often don’t want to pay out. So when you buy insurance, you might incorrectly feel safe, but only notice that you weren’t if a disaster actually happens.
If you’ve never been ill, then it’s easy to believe that if you got ill, you could just go to the doctor and be healed. Sometimes things do work that way. At other times, you might learn that reality is more complicated, and civilization less competent, than previously thought.
I think the Covid pandemic, and the (worldwide!) inadequate policy response, should’ve been at least a bit traumatizing to every person on this planet. Not necessarily on an emotional level, but certainly on an intellectual level. There’s a kind of trust one can only have in institutions one knows ~nothing about (related: Gell-Mann amnesia), and the pandemic is the kind of event that should’ve deservedly broken this kind of trust.
Agree. (I’m not saying that losing one’s trust in civilizational adequacy is necessarily a bad thing on net, just that it can also lead to some maladaptive thought patterns.)