The fundamental constants alone are going to consume ~20-30 bits each.
Not if those constants are free variables, or can be computed in some way we don’t understand. Frankly, I’d find it a bit ugly if those constants were hard-coded into the fabric of the universe.
Well, several of the universal constants arguably define our units. For every base type of physical quantity (things like distance, time, temperature, and mass, but not, for example, speed, which can be constructed out of distance and time), you can set a physical constant to 1 if you’re willing to change how you measure that quantity. For example, you can express distance in terms of time (measuring distance in light-seconds or light-years). By doing so, you can discard the speed of light: set it to 1. Speeds are now ratios of time to time: something moving at 30% the speed of light would move 0.3 light-seconds per second, so its speed would be the dimensionless quantity 0.3. You can drop many other physical constants in this fashion: offhand, the speed of light, the gravitational constant, Planck’s constant, the Coulomb constant, and the Boltzmann constant can all be set to 1 without any trouble, and therefore don’t count against your complexity budget.
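The normalization above can be sketched in a few lines of code. This is just an illustration of the unit change, using the SI value of c; the function name is mine, not anything standard:

```python
# Sketch: expressing a speed in units where c = 1,
# i.e. as a dimensionless fraction of the speed of light.
C = 299_792_458.0  # speed of light in m/s (exact, by definition of the metre)

def to_natural_units(speed_m_per_s: float) -> float:
    """Return a speed as a dimensionless number (fraction of c)."""
    return speed_m_per_s / C

# Something moving at 30% of the speed of light:
v = to_natural_units(0.3 * C)
print(v)  # a pure number near 0.3 -- light-seconds per second
```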
First note: I’m not disagreeing with you so much as just giving more information.
This might buy you a few bits (and lots of high-energy physics is done this way, with powers of electronvolts as the only units). But there will still be free variables that need to be set. Wikipedia claims (with a citation to this John Baez post) that there are 26 fundamental dimensionless physical constants. These, as far as we know right now, have to be hard-coded in somewhere, maybe in units, maybe in equations, but somewhere.
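To put a rough number on what that costs: a back-of-the-envelope sketch, assuming (and this precision figure is my assumption, not from the source) that each constant must be pinned down to about 30 bits:

```python
import math

# Back-of-the-envelope bit budget for hard-coding the dimensionless constants.
# 30 bits per constant is an assumed precision, roughly 9 significant
# decimal digits: log2(10**9) ~ 29.9.
n_constants = 26          # count from the Wikipedia / Baez list
bits_per_constant = 30    # assumption

total_bits = n_constants * bits_per_constant
print(total_bits)  # 780 bits just for the constants
```

So even after normalizing away the dimensionful constants, the residual constants alone dominate a small complexity budget.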
As a reference for anyone encountering this discussion, I thought I’d mention Natural Units explicitly. Basically, they are the systems of units that particle physicists use. They are attempts to normalize out as many fundamental constants as possible, exactly as you discuss.
Unfortunately, you can’t build a system that gets them all. You are always left with some fraction over pi, or the square root of the fine-structure constant, or something.
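The fine-structure constant is a good concrete example of why: it is a ratio in which all units cancel, so no choice of units can normalize it away. A quick sketch, computing it from CODATA 2018 values:

```python
import math

# alpha = e^2 / (4 * pi * epsilon_0 * hbar * c) is dimensionless:
# the coulombs, joules, seconds, and metres all cancel.
e = 1.602176634e-19           # elementary charge, C (exact in SI)
hbar = 1.054571817e-34        # reduced Planck constant, J*s
c = 299_792_458.0             # speed of light, m/s (exact in SI)
epsilon_0 = 8.8541878128e-12  # vacuum permittivity, F/m

alpha = e**2 / (4 * math.pi * epsilon_0 * hbar * c)
print(alpha)             # ~0.00729735..., i.e. ~1/137
print(math.sqrt(alpha))  # ~0.0854 -- the square root mentioned above
```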
I agree that the constants might be related in ways we don’t know, which would allow compression. I’m more interested in an upper bound on the complexity than an exact value (which is likely incomputable for halting problem reasons), so I’m willing to be over by 100 bits because we fail to see a pattern.
As for variable constants: sure, we can estimate the Kolmogorov complexity where the “constants” are inputs. Or we can estimate the Kolmogorov complexity of the current laws plus the state 13.7 billion years ago at the Big Bang. Or we can estimate the complexity of a program that runs all programs. All of these questions are interesting. But right now I want the answer to the one I asked, not the others.