IME in this domain, in epistemology-for-humans terms that may or may not translate easily into Solomonoff/MDL: taking a compact generator of a complex phenomenon too seriously is asking to be wrong, and not doing that is a way to be less wrong. By "too seriously" I mean concentrating probability too strongly on its predictions, not taking anomalies seriously enough or looking hard enough for them, and insufficiently expecting there to be more to say that sounds different in kind.
(Looking for compact generators but not taking them too seriously is good, but empirically seems to require more skill or experience.)
You don’t need to single out a specific complex theory to say “this simple theory is concentrating probability too strongly”, or to expect there to be some complex theory that pays for itself.
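To gesture at the "asking to be wrong" part in the most MDL-adjacent way I can: under a proper scoring rule like log score, piling probability on a mostly-right compact generator costs you much more on the anomalies than hedging costs you on the hits. A minimal sketch, with toy numbers of my own choosing rather than anything from the above:

```python
import math

# Toy log-score comparison (illustrative numbers only):
# a compact generator's prediction actually comes true 90% of the time,
# because there are anomalies it doesn't cover. Compare an overconfident
# bettor (0.99 on the generator) with calibrated and hedged bettors.

def expected_log_score(p_assigned: float, p_true: float) -> float:
    """Expected log score (nats) from assigning p_assigned to the
    generator's prediction when it holds with probability p_true."""
    return (p_true * math.log(p_assigned)
            + (1 - p_true) * math.log(1 - p_assigned))

p_true = 0.90
for p_assigned in (0.99, 0.90, 0.85):
    print(f"assign {p_assigned:.2f}: expected log score = "
          f"{expected_log_score(p_assigned, p_true):+.3f} nats")

# Output (approx.): -0.470, -0.325, -0.336 nats.
# The 0.99 bettor is punished hard by the 10% of anomalies; the hedged
# bettor gives up only a little on the cases the generator gets right.
```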