K-complexity isn’t really a matter of scale. Give me any program, and I can design a Turing machine on which that program is encoded by a single symbol.
For any two given Turing machines, you can find a constant such that the K-complexity of any program relative to one machine is within that constant of its K-complexity relative to the other. But it’s not that they differ by exactly that constant — in fact, no fixed offset can work for every program.
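Both points can be seen in a toy sketch — this is just an illustration under made-up assumptions, modeling “Turing machines” as plain Python functions over bitstrings (`U1`, `U2`, and `TARGET` are all invented for the example, not anything from the literature):

```python
from itertools import product

TARGET = "1011001110001111"  # an arbitrary 16-bit string we rig U2 to favor

def U1(desc):
    # "machine" 1: the description is simply the output itself
    return desc

def U2(desc):
    # "machine" 2: rigged so the single symbol "0" outputs TARGET;
    # any other string is described by prefixing it with "1"
    if desc == "0":
        return TARGET
    if desc.startswith("1"):
        return desc[1:]
    return None

def K(machine, s, max_len=18):
    # brute-force the shortest description that makes `machine` output `s`
    for n in range(1, max_len + 1):
        for bits in product("01", repeat=n):
            if machine("".join(bits)) == s:
                return n
    return None

# TARGET costs 16 symbols on U1 but only 1 on U2:
print(K(U1, TARGET), K(U2, TARGET))  # prints: 16 1

# Every other string costs exactly one symbol more on U2 than on U1,
# so the two complexities always agree within a constant (here, 16),
# yet the gap is +1 for most strings and -15 for TARGET — not a fixed offset.
for x in ["0", "01", "110"]:
    print(x, K(U1, x), K(U2, x))
```

The design choice here mirrors the general trick: any finite amount of structure can be “baked into” the machine itself, which is exactly why K-complexity is only defined up to an additive constant.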
Also, he gave two reasons. You only talked about the first.
Yeah, I agree that K-complexity is annoyingly relative. If there were something more absolute that could do the same job, I’d adopt it without a second thought, because it would be more “true” and less “fake” :-) And I feel the same way about Bayesian priors, for similar reasons.