Anyway, the point is to find out whether a transhuman AI would mind-control the operator into letting it out. Eliezer is smart, but he is no transhuman (yet). If he can get out, then any strong AI will.
Minor emendation: replace “would”/“will” above with “could (and for most non-Friendly goal systems, would)”.