I am a bit confused here and I would appreciate your thoughts!
Do you want to assume T∗ is finite or not? Either way I am confused:
I. T∗ is finite
In this case, the notion of almost all/almost surely (in the sense of "all but finitely many") is vacuous: any property holds up to a finite set when the underlying measure space itself has finite cardinality.
II. T∗ is infinite
While there is no immediate problem here, I think your condition, that for almost all s̄∈T∗ we want P(ϕ(s̄)=s)≤1−ε for every s∈T, becomes too strong for a reasonable simulator.
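To make sure I am reading the condition the way you intend, here is a hypothetical concretization (the function name and dictionary representation are mine, not from the post): given the simulator's next-token distribution over T at some prompt s̄, the condition says no single token s may receive probability above 1−ε.

```python
# Hypothetical concretization of the ε-condition (my own framing, not
# from the post): the simulator's next-token distribution at a prompt
# satisfies the condition iff no token's probability exceeds 1 - ε.
def satisfies_condition(next_token_probs: dict, eps: float) -> bool:
    """next_token_probs maps each token s in T to P(ϕ(s̄) = s)."""
    return max(next_token_probs.values()) <= 1 - eps

print(satisfies_condition({"a": 0.6, "b": 0.4}, 0.2))    # 0.6 <= 0.8, holds
print(satisfies_condition({"a": 0.95, "b": 0.05}, 0.2))  # 0.95 > 0.8, fails
```

My worry below is that for some prompts s̄ (heavily repetitive ones), a reasonable simulator will fail this check, and that the set of such prompts is not measure-zero.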
Let S(x) denote a sufficient number of repetitions of the sequence x∈T∗ (I am being intentionally vague here, but I hope you get the idea). Consider the set X=∪n∈N{S(x)|x∈Tn}. I have not empirically verified it, but it seems like P(x|S(x)) might grow, i.e. the more often you repeat a string, the more likely it is that it will repeat itself again. And I think X is uncountable, so any reasonable measure should assign it something greater than 0.
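As a toy sketch of the mechanism I have in mind (entirely my own construction, not a real language model): fit a Laplace-smoothed bigram model to the prompt itself. Under this in-context estimator, the probability of continuing the repetition grows with the number of repetitions k, which is the behavior I am conjecturing for P(x|S(x)).

```python
# Toy illustration (not a real LM): an in-context bigram predictor with
# Laplace smoothing, fit to the prompt x repeated k times. The estimated
# probability that the next len(x) tokens are x again grows with k.
from collections import Counter

ALPHABET = "abcd"  # stand-in for the token set T

def repeat_prob(x: str, k: int) -> float:
    """P(next len(x) tokens are x | prompt = x * k), bigram model on the prompt."""
    prompt = x * k
    pairs = Counter(zip(prompt, prompt[1:]))  # bigram counts
    ctx = Counter(prompt[:-1])                # context (first-token) counts

    def p(nxt, prev):  # Laplace-smoothed conditional P(nxt | prev)
        return (pairs[(prev, nxt)] + 1) / (ctx[prev] + len(ALPHABET))

    prob, prev = 1.0, prompt[-1]
    for ch in x:
        prob *= p(ch, prev)
        prev = ch
    return prob

probs = [repeat_prob("ab", k) for k in (1, 2, 5, 20)]
print(probs)  # increases with k
```

Of course this model is built so that repetition reinforces itself, so it only illustrates the conjecture rather than testing it; the empirical question is whether actual simulators behave this way on S(x).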
I think it is also worth mentioning that parts of this post reminded me of concepts from information theory. In fact, if you go back to Shannon's seminal A Mathematical Theory of Communication, the second section already anticipates something like this (with, for example, higher temperature = more noise?). It could be, though, that your post is orthogonal to it.