I think you’re wrong to be psychoanalysing why people aren’t paying attention to your work. You’re overcomplicating it. Most people just think you’re wrong upon hearing a short summary, and don’t trust you enough to spend time learning the details. Whether your scenario is important or not, from your perspective it’ll usually look like people are bouncing off for bad reasons.
For example, I read the executive summary. For several shallow reasons,[1] the scenario seemed unlikely and unimportant. I didn’t expect there to be better arguments further on. So I stopped. Other people have different world models and will bounce off for different reasons.
Which isn’t to say the scenario is wrong (that’s just my current weakly held guess). My point is just that even if you’re correct, the way it looks a priori to most worldviews is sufficient to explain why people are bouncing off it rather than engaging properly.
Perhaps I’ll encounter information in the future that indicates my bouncing off was a mistake, and I’ll go back.
There are a couple of layers of maybes, so the scenario doesn’t seem likely. I expect power to be more concentrated. I expect takeoff to be faster. I expect capabilities to have a high cap. I expect alignment to be hard for any goal. Something about maintaining a similar societal structure without various chaotic game-board-flips seems unlikely. The goals-instilled-in-our-replacements are pretty specific (institution-aligned), and pretty obviously misaligned from overall human flourishing. Sure humans are usually myopic, but we do sometimes consider the consequences and act against local incentives.
I don’t know whether these reasons are correct, or how well you’ve argued against them. They’re weakly held and weakly considered, so I wouldn’t have usually written them down. They are just here to make my point more concrete.
I think ‘people aren’t paying attention to your work’ is a somewhat different situation from the one voiced in the original post. I’m discussing specific ways in which people engage with the argument, as opposed to just ignoring it. The baseline is that most people ignore most arguments most of the time.
Also, it’s probably worth noting that these ways of engaging seem somewhat specific to the crowd over-represented here; in different contexts, people engage with it in different ways.
I think the shell games point is interesting though. It’s not psychoanalysing (one can think people are in denial or that they hold rational beliefs about this; there’s not much point second-guessing too far), it’s pointing out a specific fallacy: a sort of god of the gaps in which every person focused on subsystem X assumes the problem will be solved in subsystem Y, which they understand or care about less because it’s not their specialty. If everyone does this, it does indeed lead to serious problems being ignored entirely, through a sort of bystander effect.