I’m probably completely confused, but is there any reason that Greg Egan’s rebuttal* to Dust Theory does not also apply to any Big World scenario?
*Q5
Is Dust Theory any different from the idea that any brain states you want would appear as Boltzmann brains?
Er… yes.
I suggest that this sort of dismissive response is only marginally less helpful, and much ruder, than no response at all. How about: “Yes, it’s very different; e.g., in the one case X and in the other case Y, which is incompatible with X because Z.” Or, if your time is really so precious: “Yup, completely different; sorry, no time to explain why right now.”
I didn’t really know how to explain it. Why do people keep bringing up Boltzmann brains? Their probability is so low that they are almost never the answer to any problem, no matter how abstract. And even if one could argue that Boltzmann brains outnumber regular brains, any observation you made in support of that claim would presumably itself be coming from within a Boltzmann consciousness, which undercuts its value as evidence.
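To make the “probability is so low” point concrete, here is a toy order-of-magnitude sketch. It uses the standard estimate that a thermal fluctuation with entropy cost ΔS (in units of Boltzmann’s constant) has probability on the order of exp(−ΔS); the ΔS ~ 10^50 figure for assembling a brain-sized fluctuation is an assumed, commonly quoted order of magnitude, not something from this thread:

```python
import math

def log10_fluctuation_prob(entropy_cost_in_k):
    """log10 of the rough probability exp(-dS) of a thermal
    fluctuation with entropy cost dS, in units of Boltzmann's k.
    Working in log10 avoids floating-point underflow, since the
    probability itself is far below the smallest representable float."""
    return -entropy_cost_in_k / math.log(10)

# Assumed order-of-magnitude entropy cost of a Boltzmann brain: ~1e50 k
log_p = log10_fluctuation_prob(1e50)
print(f"log10 P(Boltzmann brain fluctuation) ~ {log_p:.3g}")
```

Even granting generous variations in the assumed entropy cost, the log-probability stays so far below anything event-like that such fluctuations rarely dominate an anthropic calculation.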
I think the validity of Dust Theory and Big World scenarios depends on how much probability they assign to worlds like ours. For now we don’t have a good estimate of that probability. Egan assumes that it should be low, but I don’t know how to check that.