We have dangerous knowledge like nuclear weapons or bioweapons, yet we are still surviving. It seems like people with the right knowledge and resources are disinclined to be destructive. Or maybe there are mechanisms that ensure such people don’t succeed. What makes AI different? Won’t the people with the knowledge and resources to build GAI also be more cautious when doing the work, because they are more aware of the dangers of powerful technology?
The history of the world would be different (and a touch shorter) if, immediately after the development of the nuclear bomb, millions of nuclear-armed missiles had constructed themselves and launched themselves at targets across the globe.
To date, we haven’t invented anything that poses an existential threat without humans intentionally trying to use it as a weapon and devoting their own resources to making that happen. I think that AI is pretty different in this respect.