To put it more formally: we want an encoding algorithm E and a decoding algorithm D such that for any natural number n, a Turing machine can compute E(n) and D(E(n)) in finite time, but there exists a k for which "determine a string s of length k such that the set {n such that E(n) starts with s} is infinite" is an undecidable problem. Do such E and D exist?
I admit my working knowledge of what is decidable/computable is somewhat limited; perhaps the condition should instead be that no algorithm can return s given k, or the question should be whether there exists an algorithm that, given E and D, can return a "confusing" sequence of arbitrary length.
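To fix the setup, here is a concrete (and entirely decidable) instance of such an E/D pair, using the unary notation discussed below. This pair does not satisfy the undecidability condition — for every k, the unique length-k string with infinitely many extensions E(n) is just k ones — it only illustrates what E and D look like.

```python
def E(n):
    """Unary encoding: n ones followed by a terminating zero."""
    return "1" * n + "0"

def D(s):
    """Unary decoding: count ones until the first zero."""
    n = 0
    for bit in s:
        if bit == "0":
            return n
        n += 1
    raise ValueError("ran off the end of the input")

assert D(E(7)) == 7
```

For this pair the "confusing" string is the all-ones sequence: D never terminates on it, and every finite prefix 1…1 is the start of infinitely many codewords E(n).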
I think I have a partial solution for the case where the “infinitely confusing” string is unique. Namely, if there is a unique infinite string S on which D doesn’t terminate, then S is computable.
Here’s an algorithm for computing S by using D. Assume that the first bit of S is 0. That means D terminates on all strings whose first bit is 1. By the argument from the original post, that implies D’s running time on strings starting with 1 is uniformly bounded. (Otherwise we would be able to generate an “infinitely confusing” string for D that starts with 1 by using the proof technique from my post, which can’t happen because S is unique and starts with 0 by assumption.) Therefore, by feeding D all possible finite inputs in turn while allowing it gradually increasing running times (“dovetailing”), we will eventually get a “covering set” that proves D always terminates on strings starting with 1. So we determine that the first bit of S must be 0. In the same way you can determine the second bit of S, and so on.
Let’s see how it works on unary notation. We run D on all finite inputs in turn. First we notice that D(“0”) terminates without trying to read further, therefore the first bit of S is 1. Then we notice that D(“10”) terminates, therefore the second bit of S is also 1. And so on.
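The unary walkthrough above can be sketched as runnable code. This is my own toy rendering, not the general construction: for the unary decoder, halting within the finite input already proves halting on every infinite extension, so checking the one-bit extension directly stands in for the full dovetailing over all finite extensions that the general algorithm would need.

```python
def D_steps(bits, budget):
    """Run the unary decoder on a finite bit string, taking at most
    `budget` steps. Returns the decoded n if D halts within budget
    without reading past the end of the input, else None."""
    n = 0
    for i in range(budget):
        if i >= len(bits):
            return None  # D would need to read further input
        if bits[i] == "0":
            return n
        n += 1
    return None  # step budget exhausted

def next_bit_of_S(prefix, max_budget=100):
    """Determine the next bit of the unique non-terminating string S.
    If D provably halts on one continuation, S must take the other
    (using uniqueness of S, as in the argument above)."""
    for b in "01":
        # For this decoder, halting within the finite input prefix+b
        # means halting on every infinite extension of it,
        # so S cannot continue with b.
        if D_steps(prefix + b, max_budget) is not None:
            return "1" if b == "0" else "0"
    raise RuntimeError("budget too small to decide this bit")

prefix = ""
for _ in range(5):
    prefix += next_bit_of_S(prefix)
print(prefix)  # prints "11111": S is the all-ones string
```

Exactly as in the walkthrough: D("0") halts, so the first bit of S is 1; D("10") halts, so the second bit is 1; and so on.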
I asked MathOverflow to solve this and got a complete solution within an hour. Just like last time. Wow, just wow.
The Stack Exchange sites are pretty impressive, and humbling. Maybe Eliezer should outsource more of his research to them.
That looks true, though the process you describe only works if S is indeed unique; otherwise it gets stuck.