“It takes less than 30 bits to specify 3^^^^3, no?”
That depends on the language you specify it in.
It also depends on the implied probability curve of other things you might specify and the precision you intend to convey. There’s no way to distinguish among all the integers up to and including that one with only 30 bits.
Oh, and that only counts identical/fungible things. Specifying the contents of that many distinct variants is HUGE.
Yes, but I don’t think that’s relevant. Any use of complexity depends on the language you specify it in. If you object to what I’ve said here on those grounds, you have to throw out Solomonoff, Kolmogorov, etc.
More specifically, it seems that your c must include information about how to interpret the X bits. Right? So it seems slightly wrong to say “R is the largest number that can be specified in X bits of information” while treating c as fixed: c might grow as the specification scheme changes.
Alternatively, you might just be wrong in thinking that 30 bits are enough to specify 3^^^^3. If c indicates that the number of additional universes is specified by a standard binary-encoded number, 30 bits only gets you about a billion.
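To put a number on that: with a plain unsigned binary encoding, n bits can only distinguish 2^n values, so 30 bits tops out just above a billion. A minimal sketch (the function name is mine, just for illustration):

```python
# With a standard unsigned binary encoding, n bits can name only 2**n
# distinct integers -- so 30 bits gets you to about a billion, nowhere
# near 3^^^^3.
def max_binary_value(bits: int) -> int:
    """Largest integer representable in `bits` unsigned binary digits."""
    return 2**bits - 1

print(max_binary_value(30))  # 1073741823, i.e. about a billion
```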
It only takes less than 30 bits if your language supports the ^^^^ notation and that’s not standard notation.
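For concreteness, here is a sketch of Knuth's up-arrow notation (which the ^^^^ shorthand denotes): if the language's primitives include an up-arrow operator, then 3^^^^3 is the tiny expression `up_arrow(3, 4, 3)`, even though its value is astronomically large and only small arguments are actually computable. The function name is mine, for illustration:

```python
# Knuth's up-arrow notation: a ↑^n b. n=1 is ordinary exponentiation;
# each higher n iterates the level below it. 3^^^^3 would be
# up_arrow(3, 4, 3) -- a few symbols in this language, but far too
# large to ever evaluate. Only small arguments terminate in practice.
def up_arrow(a: int, n: int, b: int) -> int:
    """Compute a ↑^n b (Knuth up-arrow) for small arguments."""
    if n == 1:
        return a**b
    if b == 0:
        return 1
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

print(up_arrow(3, 1, 3))  # 27
print(up_arrow(3, 2, 3))  # 3^(3^3) = 7625597484987
```

The point stands either way: the expression is short only because the interpreter for up-arrows is already paid for in c.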
True. So maybe this only works in the long run, once we have more than 30 bits to work with.