How do we rule out using this computation “illegitimately” (sneaking the computational work that the supposedly Turing-complete formalism was meant to do into the translation step, as in the argument that the digits of π are Turing-complete)?
I suggest that the framework used for the translation itself should not be Turing-complete (which, of course, makes the definition self-referential, but we are in less well-specified territory anyway; using this definition, we can at least cluster formalisms a bit better).
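A toy sketch of the distinction (all names here are hypothetical, for illustration only): a “legitimate” translation is a weak, e.g. finite-state, rewrite that leaves the real work to the target formalism, while an “illegitimate” one secretly runs the computation during translation and emits the answer as a constant.

```python
def legitimate_translation(program_text):
    # A computationally weak rewrite: plain string substitution,
    # expressible by a finite-state transducer (not Turing-complete).
    # The target formalism still has to do the actual computation.
    return program_text.replace("ADD", "+").replace("MUL", "*")

def illegitimate_translation(program_text):
    # The "translator" itself executes the program and emits a constant:
    # all the work the target formalism was supposed to do happens here,
    # so the translation step smuggles in the computation.
    result = eval(legitimate_translation(program_text))
    return str(result)

print(legitimate_translation("2 ADD 3"))    # "2 + 3": work left for the target
print(illegitimate_translation("2 ADD 3"))  # "5": work already done in translation
```

The second translator is exactly the move the restriction above rules out: because it is itself as powerful as the formalism being reduced to, nothing is demonstrated about the target.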