Isn’t there a leverage penalty built into a Kolmogorov complexity prior when the bit string you’re trying to generate is a particular agent’s sense data? Because the more stuff there is in the universe, the more bits are required to locate that agent within it. And does this solve a problem with just using normal anthropics, namely that an anthropic leverage penalty doesn’t help a paperclip maximizer reason about a potential universe containing only itself and a bunch of paperclips that could be destroyed, since paperclips aren’t anthropic reasoners?
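The counting intuition behind the question can be sketched as follows. This is an illustrative toy model, not from the original comment: it assumes the agents are equally simple to specify apart from an index, so singling one out costs about log2(n) bits under a uniform code, and the prior probability of being any particular agent falls as 1/n.

```python
import math

def bits_to_locate(n_agents: int) -> float:
    """Minimum bits needed to single out one agent among n
    equally simple candidates, assuming a uniform code: log2(n)."""
    return math.log2(n_agents)

# Each doubling of the number of agents adds one bit to the
# description length of "this particular agent's sense data",
# which is the leverage-penalty-like effect in the question.
print(bits_to_locate(8))        # 3.0 bits for 8 agents
print(bits_to_locate(10**100))  # ~332 bits for a googol of agents
```

The point of the sketch is only that the index term grows with the size of the hypothesized world, independently of whether the things being indexed are anthropic reasoners.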