When it comes to AI, though, I think the right analogy is not “individual human vs. individual AI” but “human civilization (with computers)” vs. “AI civilization”. It seems pretty clear to me that most of the things to worry about with AGI stem from it gaining the ability to do ‘cultural accumulation’, at least with regard to technological knowledge, and aren’t related to “it has a much higher working memory!” or other sorts of individual intelligence advantages. Through this lens, the superiority of artificial intelligence is mostly cultural superiority.
I agree with this, but I’m mostly interested in this question because the abilities of AI culture depend quite heavily on the abilities of individual AI systems.
I don’t see why this is the case for AI systems in a different way than it is for humans.
The fact that an AI can easily clone specific instances of itself makes it much faster for it to spread culture. We humans can’t simply make 100,000 copies of Elon Musk, but if Elon were an AI, that would be easy.