2025-10-19
How to trigger fear and panic in general population, such that they organise mass protest against the creation of superintelligence
Disclaimer
Quick Note
(Contains untested hypotheses)
How to trigger fear and panic
If AI capabilities continue to grow on the same trend, then by 2027 I expect:
The US natsec establishment will be convinced that AI is as important as nukes, even if they’re not ASI-pilled, and will publicly declare that “we” are in a Cold War. This will help trigger fear in the general public.
A lot more people in the general public will actually try the latest AI models and hence be convinced that AI is as important as nukes, even if they’re not ASI-pilled.
I think US natsec circles are a machinery with not much agency. Just convincing a few people about ASI or xrisk is not enough to change the direction of the entire machinery.
This is also why I am not optimistic about just persuading people in natsec to stop building ASI.
You need the woke protest machinery in the US to organise mass protests and counter the natsec machinery. It is important to work with people who already have the machinery required to organise large-scale protests; there is no time to build it from scratch.
I expect mass protest will have only a medium influence on the electoral incentives of the US Congress and Senate. Previous mass protests in the US have had only a medium-sized influence at best, whether against the nuclear arms race, the Vietnam War, the Syria war and Iraq WMDs, or now the Palestine and Ukraine wars.
I think we need to use fear of death to trigger a fight-or-flight response in people.
War footage is especially important for triggering people to protest against war. I think some protestors against the Ukraine and Palestine wars were motivated by seeing war footage.
Talking about death, visiting graveyards, etc. seems important.
A lot of people have organised much of their psyche around coping with their fear of death. Actually breaking people out of this and reminding them they could die soon can trigger them into fight-or-flight.
I think we need to use fear and hatred of the outgroup to trigger a fight-or-flight response in people.
Every well-organised political group in the US seems united in their hatred of another group. Christians hate atheists. Rural people hate urban people. Communists hate the tech industry. A lot of worker unions hate anyone who can take their job. Tech founders hate people who talk about ethics. Tech academia hates the tech industry too.
With the right rhetoric, Silicon Valley and Big Tech in particular can easily become the outgroup for most of the groups in the US.
See also: I can tolerate anybody except the outgroup, by Scott Alexander
Each political group needs to be triggered into fight-or-flight by a YouTuber thought leader who actually believes in the ideals of that group. People are very bad at finding truth, but they’re very good at detecting authenticity in other people. If you don’t actually believe in the ideals of a group and you are not exceptionally skilled, don’t try to become a thought leader for that group. Ally with the existing thought leaders instead, and ask them to talk about ASI.
Fear of the outgroup comes from the outgroup having more power than you.
The truth takes too long to transmit. Do not appeal to truth alone.
Most people are trapped in groups where only a few members are truth-seekers; the rest mindlessly copy the beliefs of their family and friends. Triggering a belief cascade across an entire group takes too long, and humanity might go extinct before that happens.
Reinforcing people’s existing false beliefs is far faster than convincing them of a truth, and this is how many YouTubers become popular.
I think just sharing the truth is more powerful than a lot of people assume, but timelines are so short that it may not be enough to convince most of the public.
Most people focus on the emotional content of a message more than the factual content.
This also applies to groups that contain a lot of truth-seekers, but to a weaker degree. Truth is more useful when convincing such people. Such people also disproportionately occupy some (but not all) of the centres of power in society.
Just don’t do this. This isn’t the kind of plan which works in real life.
Appealing to outgroup fear just gets you a bunch of paranoid groups who never talk to one another.
Truth-telling is relatively robust because you automatically end up on the same side as other truth-tellers and you can all automatically agree on messaging (roughly).
The only exception is leveraging fear of death, which is reasonable in smaller doses IMO when talking about AI, since dying is actually on the table.
Why is this a problem? Also, remember that they would all ideally share common outgroups.
Because scared people do random things. They may elect someone who promises to protect them, and that person’s goals will be practically random (except for the part where they want to get more power). Or they may refuse good proposals on risk reduction, because everything will seem equally scary. Etc.
Idk if this is true, but let’s say it is true. In my mind, this is still an improvement over the current situation, which is that the public will be unaware while AI lab heads and US intelligence heads rush to build the Machine God and succeed, and then either cause human extinction or build an immortal dictatorship.
What am I missing?
Ignoring the serious ethical issues inherent in manipulating people’s emotions for instrumental gain, this strategy seems highly (I’d say 95%+) likely to backfire. Intergroup relations research shows that strong us-vs-them dynamics lead to radicalisation and loss of control of social movements, and the motivated reasoning literature demonstrates that identity-defining beliefs inhibit evidence-based reasoning. Moreover, even if this somehow worked, cultivating hatred of Silicon Valley and Big Tech would likely lead to the persecution of EY-types and other AI safety researchers with the most valuable insights on the matter.
I just skimmed it. It’s not obvious to me why more radicalisation and less evidence-based reasoning is a problem.
The bottleneck is not lack of evidence, it’s lack of fear.
I can live with this.
I will read the rest of your links in a while.