I’m hearing words that make me think of the Axiom of Choice. Are you saying that you suspect a formal solution to the sleeping beauty problem has different answers depending on whether or not we have this axiom at our disposal?
At −3 Karma I’m not too eager to make that effort. But a bit more detail might be helpful.
If the voting honestly seems unjustified, take it as a sign that you’re dealing with too great an inferential distance. If you can’t tell whether the downvoting is justified, ask for feedback that includes a URL, so you can learn whatever it is that’s so obvious to everyone else :-)
I had a similar feeling, but if there are finite examples of the phenomenon, then AC is unlikely to be significant.
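To make the "finite examples" point concrete: the standard Sleeping Beauty setup is itself finite, so it can be simulated directly with no infinitary machinery at all. Here's a minimal sketch (my own illustration, not anything from the thread; the function name and parameters are mine) showing the per-awakening frequency of heads, which is where the thirder answer comes from:

```python
import random

def sleeping_beauty_trials(n=100_000, seed=0):
    """Finite simulation of Sleeping Beauty: heads -> 1 awakening,
    tails -> 2 awakenings. Returns the fraction of all awakenings
    at which the coin had landed heads."""
    rng = random.Random(seed)
    heads_awakenings = 0
    total_awakenings = 0
    for _ in range(n):
        heads = rng.random() < 0.5
        awakenings = 1 if heads else 2
        total_awakenings += awakenings
        if heads:
            heads_awakenings += awakenings
    return heads_awakenings / total_awakenings

# Per-awakening, heads shows up about 1/3 of the time (the "thirder"
# count); per-experiment it is of course 1/2 (the "halfer" count).
print(sleeping_beauty_trials())
```

Nothing here requires the Axiom of Choice, which is the sense in which a finite instance of the phenomenon makes AC unlikely to matter.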