Eliezer Yudkowsky
(Some discussions here, such as those involving numbers like 3^^^3, give me the same feeling.)
I don’t understand that quote. A good Bayesian should still pick the a posteriori most probable explanation for an improbable event, even if that explanation had very low prior probability before the event.
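For what it’s worth, the arithmetic behind that claim is easy to check; here is a toy Bayesian update in Python (the hypotheses and numbers are made up purely for illustration):

    # Toy numbers, purely illustrative: a hypothesis with a tiny prior can
    # still dominate the posterior once an improbable event is observed.
    prior = {"mundane": 0.999, "weird": 0.001}    # P(H)
    likelihood = {"mundane": 1e-9, "weird": 0.5}  # P(event | H)

    joint = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(joint.values())
    posterior = {h: joint[h] / total for h in joint}
    print(posterior)  # "weird" gets essentially all the posterior mass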
I suspect the point is that it’s not worthwhile to look for potential explanations for improbable events until they actually happen.
I think it’s more than that—he’s saying that if you have a plausible explanation for an event, the event itself is plausible, since explanations are models of the world. It’s a warning against setting up excuses in advance for why your model might fail to predict the future—you shouldn’t expect your model to fail, so when it does, you don’t say, “Oh, here’s how this extremely surprising event fits my model anyway.” Instead, you say, “Damn, looks like I was wrong.”
I don’t, however, think it’s meant to be a warning against contrived thought experiments.
Absolutely: I strongly recommend you not try to explain how 3^^^3 people might all get a dust speck in their eye without anything else happening as a consequence, for example.
It’s Yudkowsky. Sorry, pet peeve.
Fixed.
Is Eliezer claiming (a) that we aren’t living in a simulation, (b) that if we are living in a simulation, it’s extremely unlikely to generate wild anomalies, or (c) that anything other than those two possibilities is vanishingly unlikely?
Sorry to be so ignorant, but what is 3^^^3? Google yielded no satisfactory results…
http://en.wikipedia.org/wiki/Knuth_arrow
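In case a runnable definition helps, here is a minimal Python sketch of the up-arrow recursion (the name up_arrow and the sample values are just illustrative; only tiny inputs actually terminate):

    def up_arrow(a, n, b):
        """Knuth's up-arrow: n = 1 is ordinary exponentiation, and each
        higher level iterates the level below (n = 2 is tetration, etc.)."""
        if n == 1:
            return a ** b
        if b == 0:
            return 1  # by convention, a (n arrows) 0 = 1 for n >= 2
        return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

    print(up_arrow(3, 1, 3))  # 3^3 = 27
    print(up_arrow(3, 2, 3))  # 3^^3 = 3^(3^3) = 3^27 = 7625597484987
    # up_arrow(3, 3, 3) would be 3^^^3: a power tower of 3s roughly
    # 7.6 trillion levels tall. Do not try to evaluate it.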
TheOtherDave’s other comment summed up what it means practically. Also, see http://lesswrong.com/lw/kn/torture_vs_dust_specks/.
Ah thank you, that clarifies things greatly! Up-voted for the technical explanation.
A number so ridiculously big that 3^^^3 * X can be assumed to be bigger than Y for pretty much any positive X and any Y you’re likely to encounter.
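For anyone who wants the expansion spelled out (standard up-arrow arithmetic, not from the thread): 3^^^3 = 3^^(3^^3) = 3^^(3^27) = 3^^7625597484987, a power tower 3^3^3^...^3 that is 7,625,597,484,987 threes tall. Even a tower of just four 3s already has trillions of digits, so the full tower swamps any X or Y anyone would actually write down.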