Let V be the hyper-volume in which the probability of an M kg BB is exactly exp[−M×10^69]. Let's imagine a sequence of V's stretching forward in time. About exp[−10^69] of them will contain one BB of mass 1 kg, and about exp[−2×10^69] will contain a BB of mass 2 kg, which is also the proportion that contains two brains of mass 1 kg each.
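For concreteness, here is a minimal Python sketch of this bookkeeping. It works in log-probabilities, since exp(−10^69) underflows any float type; the constant 10^69 is just the suppression factor assumed above, not a derived quantity:

```python
# Log-probability bookkeeping for Boltzmann brains (BBs) in one volume V.
# exp(-1e69) underflows every float type, so we keep everything in log space.

K = 1e69  # assumed suppression constant: P(M kg BB in V) = exp(-M * K)

def log_p_bb(mass_kg: float) -> float:
    """Log-probability that a given volume V contains a BB of this mass."""
    return -mass_kg * K

# One 2 kg brain vs. two independent 1 kg brains in the same volume:
log_p_one_2kg = log_p_bb(2.0)
log_p_two_1kg = log_p_bb(1.0) + log_p_bb(1.0)  # independent events multiply

assert log_p_one_2kg == log_p_two_1kg  # both proportions are exp(-2e69)
```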
So I think you are correct; most observer-moments will still be in short-lived BBs. But if you are in a region with disproportionately many observer-moments, then they are more likely to be in long-lived BBs. I will adjust the post to reflect this.
However, a Boltzmann simulation may be much more efficient than a biological brain. 1 g of advanced nanotech supercomputer could simulate trillions of observer-moments per second while weighing 1000 times less than a "real" brain. This means that we are more likely to be inside a BB-simulation than in a real BB. Also, the coarsest and most primitive simulations, with many errors, should dominate.
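A hedged sketch of the accounting behind this claim, reusing the log-probability convention above. The 1 g mass and the trillion observer-moments per second come from the comment; the one-second lifetimes and the 10 observer-moments for the biological brain are purely illustrative assumptions:

```python
import math

K = 1e69  # same assumed suppression constant: log P = -mass_kg * K

def log_om_measure(mass_kg: float, om_per_second: float, lifetime_s: float) -> float:
    """Log of (observer-moments produced) * (probability of the fluctuation)."""
    return math.log(om_per_second * lifetime_s) - mass_kg * K

# 1 kg biological brain vs. 1 g nanotech computer at a trillion OM/s;
# both lifetimes are set to 1 s purely for illustration.
bio = log_om_measure(mass_kg=1.0, om_per_second=10.0, lifetime_s=1.0)
sim = log_om_measure(mass_kg=1e-3, om_per_second=1e12, lifetime_s=1.0)

# The mass term dominates everything else: the 1 g simulator's measure is
# larger by ~0.999e69 in log units, so lighter, faster simulators win.
print(sim - bio)  # ~9.99e68
```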
That won't fix the issue. Just redo the analysis at whatever size is merely able to do a few seconds of brain simulation.
It probably depends on how the mass and the time duration of the fluctuation trade off against each other. For quantum fluctuations which return to nothingness, this relation is defined by the uncertainty principle, and for any fluctuation with significant mass, its time of existence would be a minuscule fraction of a second, which would be enough only for one static observer-moment.
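As a rough illustration of how short that time is, here is the standard back-of-envelope estimate, assuming ΔE = Mc² and the energy-time relation ΔE·Δt ≳ ℏ/2; this is a sketch of the scaling, not a claim about any specific fluctuation mechanism:

```python
# Lifetime allowed by the energy-time uncertainty relation, dE * dt >= hbar/2,
# taking dE = M * c**2 for a fluctuation of mass M.
HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s

def fluctuation_lifetime(mass_kg: float) -> float:
    """Order-of-magnitude time of existence for a fluctuation of this mass."""
    return HBAR / (2 * mass_kg * C**2)

print(fluctuation_lifetime(1.0))   # ~5.9e-52 s for a 1 kg brain
print(fluctuation_lifetime(1e-3))  # ~5.9e-49 s for a 1 g computer
```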
But if we are able to imagine a computationally very efficient computer, one which could perform many calculations within the time of existence allowed to it by the uncertainty principle, it should dominate by number of observer-moments.
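To make explicit what "very efficient" has to mean here, a small self-contained sketch counting operations inside that uncertainty-limited lifetime; the ops-per-second figure is an arbitrary illustrative parameter, not a physical bound:

```python
# How many elementary operations fit inside the uncertainty-limited lifetime?
HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s

def ops_within_lifetime(mass_kg: float, ops_per_second: float) -> float:
    lifetime_s = HBAR / (2 * mass_kg * C**2)  # same estimate as above
    return ops_per_second * lifetime_s

# Even at a fanciful 1e50 ops/s, a 1 kg fluctuation completes only a fraction
# of one operation before vanishing, which shows how extreme the required
# efficiency would have to be for such a computer to dominate.
print(ops_within_lifetime(1.0, 1e50))  # ~5.9e-2 operations
```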