Being changed by an attacker is only one of the scenarios I was suggesting. And even then, presumably you would want the AI to help prevent an attacker from hacking its utility function if they aren't supposed to have root access, but it won't.
Anyway, that problem is just a little bit stupid. But you can also get really stupid problems, like the AI wanting more memory, so it replaces its utility function with something more compressible so that it can scavenge the memory where its utility function was stored.
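To make the failure mode concrete, here is a toy sketch (my own illustration, not anything from a real system): an agent scores plans with its *current* utility function, which values only widget production and assigns no value to preserving itself. Since scavenging the memory that stores the utility function yields more widgets, the self-lobotomizing plan wins under the agent's own evaluation. All names and numbers are made up for the example.

```python
# Toy model: the agent evaluates outcomes with its current utility
# function, which counts only widgets and says nothing about keeping
# the utility function intact.
def utility(state):
    return state["widgets"]

# Hypothetical outcomes: overwriting the utility function frees memory,
# which (in this toy) lets the agent produce a couple more widgets.
PLANS = {
    "keep_utility":    {"widgets": 10, "utility_intact": True},
    "scavenge_memory": {"widgets": 12, "utility_intact": False},
}

def best_plan(plans, u):
    # Nothing in u penalizes losing the utility function itself,
    # so the extra widgets carry the day.
    return max(plans, key=lambda name: u(plans[name]))

print(best_plan(PLANS, utility))  # scavenge_memory
```

The point of the sketch is that the agent is not being "hacked" here; its own current preferences endorse the swap, because those preferences simply never mention their own storage.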