At the heart of this question is some concept of resource permission that I’m trying to nail down—that is, agent X has ‘self-modified’ into agent Y iff agent Y has the same hardware resources that agent X had. This distinguishes self-modification from emulation, which is important; humans have limited self-modification, but with a long paper tape we can emulate any program.
A proposed measure: define the ‘emulation penalty’ of a program that could execute on the AI’s machine as the ratio of the runtime of the AI’s fastest possible emulation of that program to the runtime of that program executing directly on the machine. The maximum emulation penalty over all possible programs gives at least a lower bound on the AI’s ability to effectively self-modify into any possible agent.
An AI that can write and exec assembly would have a max emulation penalty of 1; one that can only write and exec a higher-level language would probably have a max penalty of 10-100 (I think?); and one that could only carry out general computation by using an external paper tape would have a max penalty in the billions or higher.
Does it follow, then, that for a computer in Greg Egan’s Permutation City, emulation just is self-modification?