Is it possible to assign a non-zero prior probability to statements like “my memory has been altered”, “I am suffering from delusions”, and “I live in a perfectly simulated matrix”?
Of course we have to assign non-zero probabilities to them, but I’m not quite sure how we’d figure out the right priors. Assuming that the hypotheses that your memory has been altered or you’re delusional do not actually cause you to anticipate anything differently (see the bit about the blue tentacle in Technical Explanation), you may as well live in whatever reality appears to you to be the outermost one accessible to your mind.
(As for the last one, Nick Bostrom argues that we can actually assign a very high probability to a statement somewhat similar to “I live in a perfectly simulated matrix” — see the Simulation Argument. I have doubts about the meaningfulness of that on the basis of modal realism, but I’m not too confident one way or the other.)
I disagree with the idea that modal realism, whether right or not, changes the chances of any particular hypothesis like that being true. I am not saying that we can never have a rational belief about whether or not modal realism is true: there may or may not be a philosophical justification for modal realism. However, I do think that whether modal realism applies has no bearing on the probability of you being in some situation, such as in a computer simulation. I think this issue needs debating, so for that purpose I have asserted it as a rule, which I call “The Principle of Modal Realism Equivalence”; that gives us something well-defined to argue for or against. I define and assert the rule, and give a (short) justification of it, here:
http://www.paul-almond.com/ModalRealismEquivalence.pdf.
But what if you should anticipate things very differently, if your memory has been altered? If I assigned a high probability to my memory having been altered, then I should expect that the technology exists to alter memories, and all manner of even stranger things that that would imply. Figuring out what prior to assign to a case like that, or whether it can be done at all, is what I’m struggling with.
It’s not actually all that hard to mess with memories.