BBs can’t make correct judgements about their reality; their judgements are random. So 50 per cent of BBs think that they are in a non-random reality even if they are in a random one. So your experience doesn’t provide any information about whether you are a BB or not. Only the prior matters, and the prior is high.
The quoted figure does not follow. Random, yes; but it’s not a coinflip. Given that a Boltzmann Brain can randomly appear with any set of memories, and given that the potential set of random universes is vastly larger than the potential set of non-random universes, I’d imagine that the odds of a randomly-selected Boltzmann Brain thinking it is in a non-random universe are pretty low...
That would be true if BBs had time to think about their experiences and the ability to come to logical conclusions. But BBs’ opinions are also random.
Hmmm. If the Boltzmann Brain has no time to think and update its own opinions from its own memory, then it is overwhelmingly likely that it has no opinion one way or another about whether or not it is in a random universe. In fact, it is overwhelmingly likely that it does not even understand the question, because its mindspace does not include the concepts of both “random” and “universe”...
Of course, most BBs don’t think about whether they are random or not. But within the subset of BBs who have thoughts about it (we can’t say they are thinking, as thinking is a longer process), those thoughts are random, and 50 per cent think that they are not random. So updating on experience doesn’t move BB probabilities much, but I am still not afraid of being a BB, for two other reasons:
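The disagreement above (a coinflip among opinion-holders vs. low odds overall) can be made concrete with a toy model. All the numbers here are illustrative assumptions, not claims from the thread: each brain gets one randomly chosen thought out of a large space of possible thought-contents, only two of which are even *about* the random/non-random question.

```python
import random

random.seed(0)

# Toy model (assumed numbers): each Boltzmann Brain holds one random
# thought from a large space. Only a tiny fraction of thoughts concern
# the random/non-random question at all; among those, the stance taken
# is itself random, i.e. an even split.
N_BRAINS = 1_000_000
N_POSSIBLE_THOUGHTS = 10_000   # assumed size of the thought space
# thought 0 = "reality is random", thought 1 = "reality is non-random"
QUESTION_THOUGHTS = {0, 1}

have_opinion = 0
think_non_random = 0
for _ in range(N_BRAINS):
    thought = random.randrange(N_POSSIBLE_THOUGHTS)
    if thought in QUESTION_THOUGHTS:
        have_opinion += 1
        if thought == 1:
            think_non_random += 1

print(f"fraction with any opinion: {have_opinion / N_BRAINS:.4f}")
print(f"of those, 'non-random':    {think_non_random / max(have_opinion, 1):.2f}")
```

On this toy model both commenters are right about different populations: almost no BBs hold any opinion, but among those that do, the split is close to 50/50.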
Any BB is a copy of a real observer, and so I am real. (This depends on how personal identity is resolved.)
BBs and real observers are not the dominating class of observers. There is a third class: Boltzmann supercomputers which simulate our reality. They are medium-sized fluctuations which are very effective at creating trillions of observer-moments which are rather consistent. But a small amount of randomness also exists in such simulated universes (it could be found experimentally). I hope to elaborate on the idea in a long post soon.
I found a similar idea in a recent article about Boltzmann Brains:
“What we can do, however, is recognize that it’s no way to go through life. The data that an observer just like us has access to includes not only our physical environment, but all of the (purported) memories and knowledge in our brains. In a randomly-fluctuating scenario, there’s no reason for this “knowledge” to have any correlation whatsoever with the world outside our immediate sensory reach. In particular, it’s overwhelmingly likely that everything we think we know about the laws of physics, and the cosmological model we have constructed that predicts we are likely to be random fluctuations, has randomly fluctuated into our heads. There is certainly no reason to trust that our knowledge is accurate, or that we have correctly deduced the predictions of this cosmological model.” https://arxiv.org/pdf/1702.00850.pdf
As I said before about skeptical scenarios: you cannot refute them by argument, by definition, because the person arguing for the skeptical scenario will say, “since you are in this skeptical scenario, your argument is wrong no matter how convincing it seems to you.”
But we do not believe those scenarios, and that includes the Boltzmann Brain theory, because they are not useful for any purpose. In other words, if you are a Boltzmann Brain, you have no idea what would be good to do, and in fact according to the theory you cannot do anything because you will not exist one second from now.
I don’t think that’s descriptively true at all. Regardless of whether or not I see a useful way to address it, I still wouldn’t expect to dissolve momentarily with no warning.
Now, this may be because humans can’t easily believe in novel claims. But “my” experience certainly seems more coherent than one would expect a BB’s to seem, and this calls out for explanation.
A Boltzmann brain has no way to know anything, reason to any conclusion, or whatever. So it has no way to know whether its experience should seem coherent or not. So your claim that this needs explanation is an unjustified assumption, if you are a Boltzmann brain.
One man’s modus ponens is another man’s modus tollens. I don’t even believe that you believe the conclusion.
Which conclusion? I believe that a Boltzmann brain cannot validly believe or reason about anything, and I certainly believe that I am not a Boltzmann brain.
More importantly, I believe everything I said there.
Seems like you’re using a confusing definition of “believe”, but the point is that I disagree about our reasons for rejecting the claim that you’re a BB.
Note that according to your reasoning, any theory which says you’re a BB must give us a uniform distribution for all possible experiences. So rationally coming to assign high probability to that theory seems nearly impossible if your experience is not actually random.
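The point above can be put as a two-hypothesis Bayes update. All numbers below are illustrative assumptions: the BB hypothesis spreads its likelihood uniformly over a vast space of possible experiences, so even a very high prior on it collapses once a coherent experience is conditioned on.

```python
# Illustrative Bayes update (all numbers are assumptions):
# H_bb  = "I am a Boltzmann Brain": assigns a uniform likelihood over
#         a huge space of possible experience-sequences.
# H_ord = "I am an ordinary observer": concentrates likelihood on
#         coherent experiences like the one actually observed.
prior_bb = 0.999               # even a very high prior on BB...
prior_ord = 1 - prior_bb

n_experiences = 10**30         # assumed size of the experience space
lik_bb = 1 / n_experiences     # uniform: any one experience is this likely
lik_ord = 0.01                 # coherent experience is unremarkable here

posterior_bb = (lik_bb * prior_bb) / (lik_bb * prior_bb + lik_ord * prior_ord)
print(f"posterior P(BB | coherent experience) = {posterior_bb:.3e}")
```

Under these assumed numbers the posterior on BB is driven to roughly 10^-25, which is the sense in which "rationally coming to assign high probability to that theory seems nearly impossible if your experience is not actually random."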
My reason for rejecting the claim of BB is that the claim is useless—and I am quite sure that is my reason. I would definitely reject it for that reason even if I had an argument that seemed extremely convincing to me that there is a 95% chance I am a BB.
A theory that says I am a BB cannot assign a probability to anything, not even by giving a uniform distribution. A BB theory is like a theory that says, “you are always wrong.” You cannot get any probability assignments from that, since as soon as you bring them up, the theory will say your assignments are wrong. In a similar way, a BB theory implies that you have never learned or studied probability theory. So you do not know whether probabilities should sum to 100% (or to any similar normalized result) or anything else about probability theory.
As I said, BB theory is useless—and part of its uselessness is that it cannot imply any conclusions, not even any kind of prior over your experiences.
I’m using probability to represent personal uncertainty, and I am not a BB. So I think I can legitimately assign the theory a distribution to represent uncertainty, even if believing the theory would make me more uncertain than that. (Note that if we try to include radical logical uncertainty in the distribution, it’s hard to argue the numbers would change. If a uniform distribution “is wrong,” how would I know what I should be assigning high probability to?)
I don’t think you assign a 95% chance to being a BB, or even that you could do so without severe mental illness. Because for starters:
Humans who really believe their actions mean nothing don’t say, “I’ll just pretend that isn’t so.” They stop functioning. Perhaps you meant the bar is literally 5% for meaningful action, and if you thought it was 0.1% you’d stop typing?
I would agree if you’d said that evolution hardwired certain premises or approximate priors into us ‘because it was useful’ to evolution. I do not believe that humans can use the sort of pascalian reasoning you claim to use here, not when the issue is BB or not BB. Nor do I believe it is in any way necessary. (Also, the link doesn’t make this clear, but a true prior would need to include conditional probabilities under all theories being considered. Humans, too, start life with a sketch of conditional probabilities.)
META: I made a comment in Discussion about the article and added my considerations there on why it is not bad to be a BB; maybe we could move the discussion there?
http://lesswrong.com/r/discussion/lw/ol5/open_thread_feb_06_feb_12_2017/dmmr