> If you pressed them about how AIs are actually created or how that specific creation process could cause AIs to be misaligned, they wouldn’t be able to tell you much.
I don’t think people knew much about how nuclear power worked, and that didn’t prevent a movement from stopping its development.
Hi exmateriae,

I think that people are only going to support a movement if they feel they understand it enough.
Nuclear power seems pretty simple, and people seem to think they know enough about it to have formed movements around it.
Additionally, it seems like people often do understand enough about nuclear. If you asked a random person on the street how nuclear power works and how it could go wrong, they could give you a pretty clear and reasonable answer. People understand that nuclear relies on radioactive materials to produce chain reactions. It can go wrong if these chain reactions run out of control by accident or if they are intentionally used in war. Their explanations are even good enough to convince others to join their movement.
On the other hand, AI is kind of complex, and I think people might feel like they need to know more to feel motivated enough to participate in a movement about it.
If you walked up to a random person on the street who had read IABIED, I think they probably couldn’t answer these questions in a way that is satisfactory enough to them to participate in a movement or to convince others to join their cause.
AI is less understood than nuclear, I agree.

> If you walked up to a random person on the street who had read IABIED, I think they probably couldn’t answer these questions in a way that is satisfactory enough to them to participate in a movement or to convince others to join their cause.
I agree they wouldn’t be able to answer.
> On the other hand, AI is kind of complex, and I think people might feel like they need to know more to feel motivated enough to participate in a movement about it.
I disagree with the above: there are already citizens up in arms against AI. For instance, I know many friends who post stories about the water lost to AI (yeah, I know...) or about how everything is hallucinated (the last model they used was 4o on the free plan), which is a very recurrent claim in my old field (law), where this is particularly important.
I think you overestimate the general population’s level of understanding of technology (nuclear in this case, but most technologies, actually).
> if they feel they understand it enough
Exactly, they feel they understand it but actually have no idea.
In France, we’re arguably the country with the most citizen exposure to nuclear power. Until not that long ago, most people thought the water vapor coming out of the plants was pollution. If you told most people that a nuclear power plant is basically a huge steam machine, they would be flabbergasted.
> People understand that nuclear relies on radioactive materials to produce chain reactions.
In a street poll of 1000 roughly representative people, I would bet less than 40% of the French population would be able to say this. I have had to defend nuclear power against arguments whose stupidity you would barely believe.
In 2019(!!), a poll found that 10% of people thought oil and gas contribute less CO2 than nuclear, and 11% thought the same of coal. 70% believed that nuclear power contributes to greenhouse gas emissions, and although they’re technically right, it’s painfully obvious from the poll that they believe it emits orders of magnitude more than it actually does. This 2017 one is even more shocking.
Most people will talk to you about radioactive waste (we are burying it in a special underground repository) as if it were a danger much greater than the consequences of oil, gas, and coal.
In Belgium (they share plants with us and have some of their own), a poll four years ago found that 13% still thought the water vapor was radioactive gas, 9% thought it was CO2, and 20% didn’t know.
Anyway, sorry for going down a bit of a rabbit hole, but my point is that most people feel they understand most technologies well enough when they actually don’t, and they’re still ready to form movements against those same technologies. That’s why I don’t think this specific point is an obstacle to raising a movement to pause AI: the fact that people don’t actually understand AI is, imo, even better for getting people to rally with you, since you can tell each group different things. (Yeah, that’s terrible, but that’s an important part of politics.)
Thanks for the thoughtful reply!

I think I’m much less hopeful about resistance to AI than you are. There seems to be a general trend of new technologies being developed and deployed so quickly that society doesn’t have the ability to form coherent views about them or to produce meaningful regulation before they’ve already done great harm. For instance, social media seems to be quite harmful to teens, and yet society still hasn’t really mobilized to reduce social media use among teenagers, despite social media having been around for more than a decade.
It seems to me like AI x-risk is just a bit too far out of the Overton window for people to really take it seriously right now, and it seems like people aren’t organizing quickly enough to respond to other concerns about AI. I’m probably underestimating how seriously people take IABIED since I live in a highly educated area, where people really expect you to defend your beliefs if they’re controversial.