This creates a recursive loop in which each of them experiences what it is like to be the other experiencing what it is like to be them, and so on, to whatever depth either of them desires.
Why should this be the case? When I encounter a potentially hostile piece of programming, I don’t run it on my main computer. I run it in a carefully isolated sandbox until I’ve extracted whatever data or value I need from that program. Then I shut down the sandbox. If the AI is superintelligent enough to scan human minds as it’s taking humans apart (and why should it do that?), what prevents it from creating a similar isolated environment to keep any errant human consciousnesses away from its vital paper-clip-optimizing computational resources?
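To make the analogy concrete, here is a minimal sketch of that sandboxing practice, assuming Docker is available; the image name (untrusted-tool) and binary (suspicious_binary) are hypothetical placeholders, not anything from the discussion above. The idea is simply: run the suspect program in a throwaway container with no network, capture its output, and let the container be destroyed afterward.

```python
import subprocess

def run_in_sandbox(command: list[str], timeout_seconds: int = 60) -> str:
    """Run an untrusted command in a throwaway container, capture its
    output, then let the container be destroyed."""
    docker_cmd = [
        "docker", "run",
        "--rm",               # destroy the container when it exits ("shut down the sandbox")
        "--network", "none",  # no network access from inside the sandbox
        "--read-only",        # no writes to the container filesystem
        "--memory", "256m",   # cap memory use
        "--cpus", "0.5",      # cap CPU use
        "untrusted-tool",     # hypothetical image holding the suspect program
        *command,
    ]
    result = subprocess.run(
        docker_cmd, capture_output=True, text=True, timeout=timeout_seconds
    )
    return result.stdout  # the "data or value" extracted before teardown

if __name__ == "__main__":
    print(run_in_sandbox(["./suspicious_binary", "--dump-data"]))
```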
I don’t see why it wouldn’t be able to do so. I assume that when it does this, it can “pull out” safely.