Estimating the Kolmogorov complexity of the known laws of physics?

In the post Complexity and Intelligence, Eliezer says that the Kolmogorov complexity (the length of the shortest equivalent computer program) of the laws of physics is about 500 bits:

Suppose you ran a Turing machine with unlimited tape, so that, starting from our laws of physics, it simulated our whole universe—not just the region of space we see around us, but all regions of space and all quantum branches. [...]

Then the “Kolmogorov complexity” of that entire universe [...] would be 500 bits, or whatever the size of the true laws of physics when written out as equations on a sheet of paper.
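(For concreteness: compression gives a crude upper bound on Kolmogorov complexity, since a decompressor plus the compressed data is itself a program that reproduces the original. The toy Python sketch below only illustrates the definition; it is not an analogy for the 500-bit claim about physics.)

```python
import os
import zlib

# Compressed size upper-bounds Kolmogorov complexity (up to the fixed size of
# the decompressor): regular data has a short description, random data
# essentially does not.
regular = b"ab" * 10_000      # 160,000 bits of highly regular data
noise = os.urandom(20_000)    # 160,000 bits, incompressible with overwhelming probability

print(len(zlib.compress(regular)) * 8, "bits")  # far fewer than 160,000
print(len(zlib.compress(noise)) * 8, "bits")    # close to 160,000
```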

Where did this 500 come from?

I googled around for estimates of the Kolmogorov complexity of the laws of physics, but didn't find anything. Certainly nothing as concrete as 500.

I asked about it on the Physics Stack Exchange, but haven't received any answers yet.

I considered estimating it myself, but doing that well would involve a significant time investment. I'd need to learn the Standard Model well enough to write a computer program that simulated it (however inefficiently or intractably; it's the program length that matters, not its time or memory performance).
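To make "the program length is the measured quantity" concrete, here's a deliberately hollow sketch. Everything in it (the CONSTANTS dict, the evolve stub) is a hypothetical placeholder, not a claim about how such a simulator would actually be written:

```python
# Illustrative skeleton only; the Kolmogorov estimate is the length of the
# (shortest such) source, not its runtime or memory use.

CONSTANTS = {
    "alpha": 7.2973525693e-3,  # fine-structure constant, to its measured precision
    # ... plus the remaining ~25 free parameters of the Standard Model (and gravity)
}

def evolve(state):
    """Placeholder for one hopelessly slow brute-force step of the field equations."""
    raise NotImplementedError

if __name__ == "__main__":
    with open(__file__, "rb") as f:
        source = f.read()
    print(f"program length: {len(source) * 8} bits")  # the number being estimated
```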

Based on my experience programming, I'm sure it wouldn't take a million bits. Probably less than ten thousand. The demoscene does some pretty amazing things with 4096 bytes. But 500 sounds like a teeny tiny amount to mention offhand for fitting the constants, the forces, the particles, and the mathematical framework for doing things like differential equations. The fundamental constants alone are going to consume ~20-30 bits each.
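Here's the rough arithmetic behind that last claim, under my own assumptions about precision (a constant specified to n significant decimal digits needs about n·log2(10) ≈ 3.3n bits), and taking the commonly cited figure of roughly 25 free parameters in the Standard Model:

```python
import math

def bits_for_sig_digits(digits: int) -> float:
    # ~3.32 bits per decimal digit of precision
    return digits * math.log2(10)

print(round(bits_for_sig_digits(7)))       # ~23 bits for a constant known to 7 digits
print(round(bits_for_sig_digits(10)))      # ~33 bits for one known to 10 digits
print(round(25 * bits_for_sig_digits(7)))  # ~580 bits if ~25 parameters each
                                           # need ~7 digits: already past 500
```

If that's right, the constants alone bust a 500-bit budget unless most of them can somehow be derived rather than hard-coded.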

Does anyone have a reference, or even a more fully worked-through estimate?