Something to do with it not being theoretically possible to create an AI without teaching it to think through interaction. (I mean… what? Identify the thing that is an AI after it has been taught to think, then combine bits of matter in such a way that you have that AI. Basic physical reductionism!)
Well, one could make a computational complexity argument that there is no way to “identify the thing that is an AI after it has been taught to think” other than by actually interacting with it.
Sure, but once you have done so, you can build another one that you didn’t interact with.
On the other hand, how much variation do you need to introduce before you can declare that the second copy is a different intelligence than the one that you copied it from? And how sure can you be that it’s still an AI after this variation? So there’s an argument to be made there, although I’m far from convinced for now.