So one’s prior for the truth of Scientology can’t be anywhere near as low as one would think if one simply assigned an exponentially low probability based on the complexity of the religion.
If nobody had ever proposed Scientology, though, learning Xenu existed wouldn’t increase our probabilities for most other claims that happen to be Scientological. So it seems to me that our prior can be that low (to the extent that Scientological claims are naturally independent of each other), but our posterior conditioning on Scientology having been proposed can’t.
Right, because “Scientology is proposed” itself has an extremely low prior, namely one in proportion to the complexity of the claim.
In proportion to the complexity of the claim given that humans exist, which is much lower (=> higher prior) than its complexity in a simple encoding, since Scientology is the sort of thing that a human would be likely to propose.
The prior for “Scientology is proposed” is higher than the simple complexity prior of the claim, to the (considerable) extent that Scientology is the sort of thing a human would make up.
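The exchange above can be made concrete with a toy Bayes calculation. The numbers below are purely illustrative assumptions, not estimates: the point is only that when the “a human invents it anyway” probability dwarfs the raw complexity prior, the posterior on truth given the proposal ends up far above that prior, yet still tiny.

```python
# Toy Bayes update: how conditioning on "the claim was proposed"
# moves the probability that the claim is true.
# All numbers are illustrative assumptions, not estimates.

p_true = 1e-12            # raw complexity prior that the claim is true
p_prop_if_true = 0.5      # chance someone proposes it, given it's true
p_prop_if_false = 1e-6    # chance a human makes it up anyway -- large
                          # relative to p_true, since it's the sort of
                          # thing a human would be likely to propose

# Law of total probability: P(proposed) is dominated by the
# "invented by a human" term, not by the truth term.
p_proposed = p_prop_if_true * p_true + p_prop_if_false * (1 - p_true)

# Bayes' theorem: P(true | proposed)
posterior = p_prop_if_true * p_true / p_proposed

print(f"P(proposed)      = {p_proposed:.3g}")
print(f"P(true|proposed) = {posterior:.3g}")
```

With these made-up numbers the posterior comes out around 5e-7: many orders of magnitude above the 1e-12 complexity prior, because almost all the probability mass of “Scientology is proposed” flows through the “a human made it up” channel rather than the “it is true” channel.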