I felt a similar pang of dissent when I read Feeling Rational for the first time, and I couldn’t quite put my finger on it. After reading your comment and thinking on it, I think I’ll formulate it thus: “A distinction should be made between seeking comfort and seeking comfort in false beliefs. The former is acceptable and the latter is unacceptable.”
For me, some of Eliezer’s words at the ‘How Should Rationalists Approach Death’ panel at Skepticon 4 were brought to mind:

“I guess the most important message I might have to give on the topic of how to deal with death is: It’s alright to be angry; it’s alright not to be comforted. For me, the prototype of this experience will always be the funeral of my little brother, who died at the age of 19. I went to his funeral; I was the only open atheist there, I believe. And, for me, there was no confusion in that experience. Pure anger. Pure wishing-it-didn’t-happen. No need to seek comfort for it. And that may not have been a pleasant experience, but I think that it was, in a fundamental sense, more healthy, for being less conflicted, than what I saw on the faces of my relatives, and my parents, as they tried to attribute it to God.”
People often seek comfort in false beliefs when faced with their own death or the death of others, and the implication is that this is very harmful when you take cryonics seriously, as I do. Since this was the context of the panel, I can see how publicly encouraging people to avoid those false beliefs is instrumentally rational. But those words didn’t sit entirely right with me, because it felt like he was encouraging people to suffer more than they needed to even if they didn’t have false beliefs. It didn’t feel like seeking comfort and believing false but comforting things had been adequately distinguished.
It’s entirely possible that I simply misinterpreted his words, to my own harm, but I should say that I had some difficulty when my mother died, because I felt it was Bad of me not to realize and feel, on a gut level, how much value was destroyed when she died, and not to want to realize and feel it. I felt that not feeling it, and not wanting to feel it, was simply inaccurate; that I was making a mistake I should feel bad about, because my emotions were inaccurate; wrong; Bad.

But there’s a difference between the belief and the affect. There’s a sense in which System 1 doesn’t even know that my mother is dead. I haven’t let it, because it does no good. The conventional wisdom is that this is suppressing Feelings That Need To Be Felt, but I don’t feel generally stressed, like I’m holding something in that I need to get out. I’m never going to be okay with her death in the sense that it will be congruent with my values, and viscerally feeling it just hurts. System 2 can have accurate verbal beliefs about death and cryonics without the unintended side effect of System 1 thrashing around, trying to tell me that something is very wrong with one of my parent-child bonds. System 2 can have accurate beliefs without System 1 tracking their emotional implications. Feeling death on a gut level is instrumentally valuable only if you have false beliefs about death and longevity is a term in your utility function. Otherwise, it’s just torture.
But it’s also pretty clear to me that Eliezer doesn’t want people to suffer, so I want to be charitable and consider how he might have come to these conclusions and said the things he has. My hypothesis is that, for Eliezer, cognitive consonance is its own comfort. The idea that cognitive dissonance is viscerally upsetting for him fits my model well. For him, perceiving his beliefs and emotions as congruent is as comforting as having true beliefs and regulating my emotions out of existence is for me. Later on, in speech and writing, he conflates the comfort of cognitive consonance with the instrumental value of viscerally feeling the evil of death when your beliefs about death are false, merging them into one larger and less precise point.
And sometimes we must learn to remove certain facts from our mind, so they are only called upon when necessary, else they will severely depress us.
Yes, your summary line at the top feels like exactly the distinction I was trying to make. Rereading my comment, I think I was attacking a straw man: after acknowledging that EY had said rationality and feelings were not orthogonal, I went on to argue as though he’d said the opposite, that they were directly related, parallel, etc.
You could explain Eliezer’s non-standard approach in terms of his being fundamentally different from most people in that regard; or you could explain it as depending more heavily on his situation: he was a highly intelligent atheist surrounded by confused religionists. Stripping the confusion away from your pain is surely something one would hold up as an ideal in that situation.
Furthermore, we do it anyway.
A fine quote.