There’s a proof (the invariance theorem) that any two (Turing-complete) metrics can differ by at most a constant: the length of the message it takes to encode an interpreter for one metric in the other.
Of course, the constant can be arbitrarily large.
However, there are a number of domains for which this issue is arguably no big deal. But as far as I can tell, it is little comfort if you have finitely many hypotheses — you can still find some encoding that orders them any way you want.
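To make the last point concrete, here is a minimal sketch (the hypothesis names are purely illustrative): for any finite set of hypotheses and any desired ranking, the unary-style prefix code "0", "10", "110", … assigns strictly increasing description lengths in exactly that order, so the "simplicity" ordering is whatever we chose.

```python
# Sketch: with finitely many hypotheses, some prefix code realizes
# ANY desired complexity ordering. Names below are illustrative.
def prefix_code_for_ordering(hypotheses):
    """Assign codewords '0', '10', '110', ... in the given order,
    so earlier hypotheses get strictly shorter descriptions."""
    return {h: "1" * i + "0" for i, h in enumerate(hypotheses)}

# Deliberately rank the "complex-looking" hypothesis as simplest.
desired = ["epicycles", "newton", "general_relativity"]
code = prefix_code_for_ordering(desired)
lengths = {h: len(c) for h, c in code.items()}
print(lengths)  # epicycles gets length 1, newton 2, general_relativity 3
```

The codeword lengths 1, 2, 3, … satisfy the Kraft inequality (1/2 + 1/4 + 1/8 + … ≤ 1), so this is a legitimate prefix code — the adversarial ordering costs nothing in coding-theoretic terms.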