Question: roughly how low would a Solomonoff prior's probability be for something like "Coming up with this thought means that you automatically [get an FAI recipe in the mail] / [go to super-fun heaven] / [get to live for 3^^^^3 happy years]"? (Those specific examples are fine, but if you can come up with something of similarly vast utility that's more likely, that'd be even better.)
Is it something on the order of 1/10^40 or 1/10^100 or 1/10^200 or less? (Sorry for anchoring you.)
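For calibration: under the usual 2^-K form of the Solomonoff prior (where K is the length in bits of the shortest program producing the hypothesis), each of those candidate probabilities corresponds to a minimal program length. A quick sketch of the conversion, assuming the plain 2^-K weighting:

```python
import math

# Under a Solomonoff prior, a hypothesis whose shortest description is
# a K-bit program gets weight roughly 2^-K.  So a prior probability of
# 1/10^N corresponds to a minimal program of about N * log2(10) bits.
for exponent in (40, 100, 200):
    bits = exponent * math.log2(10)  # K such that 2^-K = 10^-exponent
    print(f"1/10^{exponent}  <=>  a ~{bits:.0f}-bit minimal program")
```

So 1/10^40 corresponds to a shortest program of about 133 bits, 1/10^100 to about 332 bits, and 1/10^200 to about 664 bits; the question is then which of those lengths is plausible for the shortest program encoding such a hypothesis.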
(I’m thinking of an alternative Pascal’s Mugging “solution” to which this is relevant.)