If you try to rule out a specific class of ways the AI could modify the programmer, the AI has a motive to superintelligently seek out loopholes and ways to modify the programmer indirectly.
There’s a sci-fi story there about an AI that always follows orders manipulating everyone into ordering it to do something horrible.
Of course the humans would realize this at the last minute and stop themselves.
Were you thinking of "All the Troubles of the World" by Isaac Asimov?