As I said before about skeptical scenarios: you cannot refute them by argument, by definition, because the person arguing for the skeptical scenario will say, “since you are in this skeptical scenario, your argument is wrong no matter how convincing it seems to you.”
But we do not believe those scenarios, and that includes the Boltzmann Brain theory, because they are not useful for any purpose. In other words, if you are a Boltzmann Brain, you have no idea what would be good to do, and in fact according to the theory you cannot do anything because you will not exist one second from now.
I don’t think that’s descriptively true at all. Regardless of whether or not I see a useful way to address it, I still wouldn’t expect to dissolve momentarily with no warning.
Now, this may be because humans can’t easily believe in novel claims. But “my” experience certainly seems more coherent than one would expect a BB’s to seem, and this calls out for explanation.
A Boltzmann brain has no way to know anything, reason to any conclusion, or whatever. So it has no way to know whether its experience should seem coherent or not. So your claim that this needs explanation is an unjustified assumption, if you are a Boltzmann brain.
One man’s modus ponens is another man’s modus tollens. I don’t even believe that you believe the conclusion.
Which conclusion? I believe that a Boltzmann brain cannot validly believe or reason about anything, and I certainly believe that I am not a Boltzmann brain.
More importantly, I believe everything I said there.
Seems like you’re using a confusing definition of “believe”, but the point is that I disagree about our reasons for rejecting the claim that you’re a BB.
Note that according to your reasoning, any theory which says you’re a BB must give us a uniform distribution for all possible experiences. So rationally coming to assign high probability to that theory seems nearly impossible if your experience is not actually random.
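To make the point above concrete, here is a toy Bayes update. All the numbers are illustrative assumptions of mine, not anything from the discussion: the "experience" is modeled as n binary observations that all came out coherent, the BB hypothesis is taken to assign a uniform distribution over all 2**n possible experience-strings, and the lawful-world hypothesis gives the observed coherent string a generous stand-in likelihood of 1/2.

```python
from fractions import Fraction

# Number of binary observations making up the "experience" (assumed).
n = 100

# BB hypothesis: uniform over all 2**n experience-strings, so this
# particular coherent string has likelihood 2**-n.
p_exp_given_bb = Fraction(1, 2**n)

# Lawful-world hypothesis: coherent experiences are typical; use 1/2
# as a generous stand-in likelihood for the observed string.
p_exp_given_world = Fraction(1, 2)

# Even a prior heavily favoring BB (the 95% figure from the thread)
# is overwhelmed by the likelihood ratio.
prior_bb = Fraction(95, 100)
prior_world = Fraction(5, 100)

posterior_bb = (prior_bb * p_exp_given_bb) / (
    prior_bb * p_exp_given_bb + prior_world * p_exp_given_world
)
print(float(posterior_bb))  # astronomically small
```

The posterior on BB collapses to roughly 10^-29 here, which is the sense in which "rationally coming to assign high probability to that theory seems nearly impossible" if the experience is not actually random.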
My reason for rejecting the claim of BB is that the claim is useless—and I am quite sure that is my reason. I would definitely reject it for that reason even if I had an argument that seemed extremely convincing to me that there is a 95% chance I am a BB.
A theory that says I am a BB cannot assign a probability to anything, not even by giving a uniform distribution. A BB theory is like a theory that says, “you are always wrong.” You cannot get any probability assignments from that, since as soon as you bring them up, the theory will say your assignments are wrong. In a similar way, a BB theory implies that you have never learned or studied probability theory. So you do not know whether probabilities should sum to 100% (or to any similar normalized result) or anything else about probability theory.
As I said, BB theory is useless—and part of its uselessness is that it cannot imply any conclusions, not even any kind of prior over your experiences.
I’m using probability to represent personal uncertainty, and I am not a BB. So I think I can legitimately assign the theory a distribution to represent uncertainty, even if believing the theory would make me more uncertain than that. (Note that if we try to include radical logical uncertainty in the distribution, it’s hard to argue the numbers would change. If a uniform distribution “is wrong,” how would I know what I should be assigning high probability to?)
I don’t think you assign a 95% chance to being a BB, or even that you could do so without severe mental illness. Because for starters:
Humans who really believe their actions mean nothing don’t say, “I’ll just pretend that isn’t so.” They stop functioning. Perhaps you meant the bar is literally 5% for meaningful action, and if you thought it was 0.1% you’d stop typing?
I would agree if you’d said that evolution hardwired certain premises or approximate priors into us ‘because it was useful’ to evolution. I do not believe that humans can use the sort of Pascalian reasoning you claim to use here, not when the issue is BB or not BB. Nor do I believe it is in any way necessary. (Also, the link doesn’t make this clear, but a true prior would need to include conditional probabilities under all theories being considered. Humans, too, start life with a sketch of conditional probabilities.)
META: I made a comment in the discussion thread about the article and added my thoughts there on why it would not be bad to be a BB. Maybe we could move the discussion there?
http://lesswrong.com/r/discussion/lw/ol5/open_thread_feb_06_feb_12_2017/dmmr