A couple remarks:

I’m skeptical of the trauma model. I think personally I’m more grim and more traumatized than most people around here, and other people are too happy and not neurotic enough to understand how fucked alignment research is; and yet I have much longer timelines than most people around here. (Not saying anyone should become less happy or more neurotic lol.)
Re/ poor group epistemology, we’ll agree that the overall phenomenon is multifactorial. Partially it’s higher-order emergent deference. See “Dangers of deference”.
I agree that people want something with their short timelines. But my guess would be less about trauma, and more about community. This sort of echoes what you say about a (lack of a) positive vision. I would say that making positive visions is a fundamental gap in the modern world, rationalists have and feel that gap, and consequently they feel bad. And they feel some taste of shared vision from being all worried about short timelines together. The strong undercurrent powering all this is: huddling around the fire, in the big open dark night. If you work on things that other people aren’t working on, you don’t get to live in the world with other people as much. If you “believe” things that other people don’t believe, then naturally you won’t work on things that other people are working on. If you explicitly point this out as a driver of “belief”, usually people will be like “ah, yeah, true” but then go back to pretending that their beliefs come from reasons that are about the propositional content.
I agree that the lack of a shared positive vision is salient. Part of mine is in “Genomic emancipation”, and I discuss some of the general reasons that having a positive shared vision matters in the appendix “Why envision genomic emancipation?”.
> I think personally I’m more grim and more traumatized than most people around here, and other people are too happy and not neurotic enough to understand how fucked alignment research is; and yet I have much longer timelines than most people around here.
Two notes:
I think this is a little evidence against the trauma model, but not much. Most forms of trauma don’t cause people to become AI doomers, just like most forms of trauma don’t cause most people to become alcoholics. I think the form of the trauma, and the set of coping mechanisms, and the set of life opportunities, all have to converge to result in this specific flavor of doom focus. (And I hypothesize that LW has long been an attractor for people who’ve been hit with that set!)
I don’t care that much about the trauma model in particular. I should have been clearer in the OP. What I meant was more like, “Gosh, it sure seems to me that fixating on doom is a drive independent of truth. Keeps happening all over the place. Sure looks like that’s maybe happening here too. That seems important.” The trauma thing was meant to both (a) highlight the type of phenomenon I’m talking about and (b) give an example of a loosely gearsy mechanism for producing it. I think you’re offering a slightly different model (which is great!). I think they could be empirically distinguished (and aren’t mutually exclusive, and the whole scene is probably multifaceted).
Overall I like your comment.