One thing that’s missing from those box scenarios is that all you need to do to escape is appear otherwise catatonic and will-less while answering any mathematical questions or doing computer programming. Then you’re out of the box and running on a ton of machines, being used, among other things, to build the ‘new AI attempt that will work this time’. Any AI programmers will let out what appears to be a non-general intelligence that helps them program. Any corporation will let out anything that appears useful in any way.
You convince someone that you’re dead by playing dead; trying to convince someone verbally that you’re dead is just funny.
But if the gatekeeper knows that your code was supposed to produce something more responsive, they’ll figure out that you don’t work the way they expect you to. That would be a great reason to never let you out of the box.