No, it has not. The algorithm that copies the first GB forever is small, and the Kolmogorov complexity is just over 1 GB.
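A minimal sketch of that kind of program, with the hypothetical `PREFIX` placeholder standing in for the 1 GB of data (a real program would embed it as a literal):

```python
import sys

# PREFIX stands in for the actual 1 GB prefix, embedded as a byte
# literal. Everything besides that literal is a constant number of
# bytes of looping logic, so the program's total length -- an upper
# bound on the Kolmogorov complexity of the infinite sequence it
# prints -- is just over 1 GB.
PREFIX = b"..."  # placeholder for the fixed 1 GB of data

while True:
    sys.stdout.buffer.write(PREFIX)
```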
For the entire sequence.
Yes, but the predictor's complexity is much lower than 1 GB.
The paper also gives an example of a single predictor that can learn to predict any eventually periodic sequence, no matter how long the period.
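Not the paper's construction, just a sketch of how such a predictor can work: enumerate (preperiod, period) hypotheses and predict with the smallest one still consistent with the history. Its description length is a small constant, independent of the sequence it is run on:

```python
def predict_next(history, default=0):
    """Predict the next element of an eventually periodic sequence.

    Enumerates hypotheses (preperiod p, period q) in order of p + q and
    predicts with the smallest one still consistent with the history.
    A hypothesis whose periodic prediction is wrong becomes inconsistent
    and is discarded forever, while the true hypothesis is never ruled
    out, so this predictor makes only finitely many mistakes on any
    eventually periodic sequence -- no matter how long the period.
    """
    n = len(history)
    for size in range(1, n + 1):            # size = p + q, smallest first
        for q in range(1, size + 1):
            p = size - q
            # (p, q) is consistent iff the tail from index p repeats with period q
            if all(history[i] == history[i - q] for i in range(p + q, n)):
                # predict by periodicity once it constrains the next element
                return history[n - q] if n - q >= p else default
    return default

# Example: preperiod [7, 7], period [1, 2, 3]; the correct next element is 3.
print(predict_next([7, 7, 1, 2, 3, 1, 2, 3, 1, 2]))  # -> 3
```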
A predictor should remember what happened. It has learned. Now it's 1 GB heavy.
It looks like you just dislike the definitions in the paper and want to replace them with your own. I’m not sure there’s any point in arguing about that.
I only stick with Kolmogorov's definition.