Ah, the example I gave above was not very good. To clarify:
If I can translate things like “I am a copy” to {propositions defined entirely in terms of non-magical things}, then I think it should be possible to assign probabilities to them.
Like, imagine "possible" worlds w are Turing machines, or cellular automata, or some other kind of well-defined mathematical object. Then, for any computable function f over worlds, I think that
it should be possible to assign probabilities to things like f(w)=42, or f(w)≤1, or whatever,
and the above kinds of things are (probably?) the only kinds of things for which probabilities are even "well defined".
(I currently wouldn’t be able to give a rigorous definition of what “well defined” means in the above; need to think about that.)
If you can come up with events/propositions that
cannot (even in principle) be reduced to the f(w)=x form above,
but to which it would also be necessary to assign probabilities in order to make decisions,
then I’d be interested to see them!
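To make the f(w)=x idea concrete, here's a toy sketch (my own illustration, not anything claimed above): take the "worlds" to be elementary cellular automata (rule numbers 0–255), put a simplicity-flavored prior over them, and compute the probability of propositions like f(w)=0 or f(w)≤1 by summing the prior mass of the worlds that satisfy them. The choice of prior (weighting a rule by 2^−popcount) and the particular f are arbitrary stand-ins.

```python
def step(cells, rule):
    """One step of an elementary cellular automaton on a ring (wraparound)."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def f(rule):
    """A computable function over worlds: number of live cells after 8 steps
    from a single seed on a 16-cell ring. Any computable f would do."""
    cells = [0] * 16
    cells[8] = 1
    for _ in range(8):
        cells = step(cells, rule)
    return sum(cells)

# Prior over worlds: weight rule r by 2^-popcount(r) -- roughly, "rules with
# simpler (sparser) rule tables are more probable" -- then normalize.
weights = {r: 2.0 ** -bin(r).count("1") for r in range(256)}
Z = sum(weights.values())
prior = {r: w / Z for r, w in weights.items()}

def prob(pred):
    """P(pred(f(w))): total prior mass of worlds w whose f-value satisfies pred."""
    return sum(p for r, p in prior.items() if pred(f(r)))

print(prob(lambda x: x == 0))   # P(f(w) = 0)
print(prob(lambda x: x <= 1))   # P(f(w) <= 1)
```

The point of the sketch is just that once the worlds and f are fully specified mathematical objects, probabilities of f(w)=x-shaped propositions fall out mechanically; all the philosophical work is hidden in the choice of world-model and prior.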