“You believe, overwhelmingly, that the probability (in underlying reality, divorced from the map and its limitations) is zero. It is only grudgingly that you inch even a tiny morsel of probability into the other hypothesis (that the universe is structured in such a way as to make the probability non-zero).”
There is a definite, non-zero probability that, when you throw a tennis ball at a wall, it will simply quantum-tunnel through it and fall out the other side. There’s a definite, non-zero probability that random chance will leave all the air molecules in the room on the east side, suffocating anyone caught on the west.
It’s worth noting that larger effects are exponentially less likely—just as the probability of getting heads once is 50%, twice in a row is 25%, and three times in a row is 12.5%.
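To make that exponential falloff concrete, here is a minimal sketch. It treats each air molecule as an independent fair coin (east side vs. west side)—a deliberate simplification—and the 10^27 molecule count is an assumed order-of-magnitude figure for a room, not a measured one. Because the resulting probability underflows any float, the sketch works in log10:

```python
import math

# Probability of n fair coin flips all landing heads: 0.5**n
for n in (1, 2, 3):
    print(f"{n} heads in a row: {0.5**n:.1%}")  # 50.0%, 25.0%, 12.5%

# Toy model: each of N air molecules independently "chooses" the east
# or west half of the room with probability 1/2. The chance they all
# end up east is 0.5**N. N = 10**27 is an assumed rough figure.
N = 10**27
log10_p = N * math.log10(0.5)
print(f"log10 of the probability: {log10_p:.3e}")
```

The result is on the order of -3 × 10^26: a probability whose decimal expansion starts with roughly 3 × 10^26 zeros. Astronomically small—but not zero, which is the entire point.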
Therefore, there’s a minimum computable plausibility for pretty much any claim. Your prior should never be zero, and (if you are a strict rationalist AI) you should presumably never want it to be zero.
(Your further point about metaconfidence was actually covered by GiveWell on the site previously. It’s also worth noting that what I’ve said here doesn’t prevent muggings at all; it just establishes that Pascal’s mugging has a definite non-zero probability under our current understanding of the universe.)