Are you in a Boltzmann simulation?

EDIT: Donald Hobson has pointed out a mistake in the reasoning in the section on nucleation. If we know that an area of space-time has a disproportionately high number of observer moments, then it is very likely that these are from long-lived Boltzmann simulations. However, this does not imply, as I thought, that most observer moments are in long-lived Boltzmann simulations.

Most people on LessWrong are familiar with the concept of Boltzmann brains—conscious brains that are randomly created by quantum or thermodynamic interactions, and then swiftly vanish.

There seem to be two types of Boltzmann brains: quantum fluctuations, and nucleated brains (actual brains produced in the vacuum of space by the expanding universe).

The quantum fluctuation brains cannot be observed (the fluctuation dies down without producing any observable effect: there’s no decoherence or permanence). If I’m reading these papers right, the probability of producing any given object of duration $t$ and mass $m$ is approximately

$$e^{-tmc^2/\hbar}.$$

We’ll be taking $m = 1.4\text{ kg}$ for a human brain, and $t = 0.1\text{ s}$ for it having a single coherent thought.

A few notes about this number. First of all, it is vanishingly small, as an exponential of a negative exponential. It’s so small, in fact, that we don’t really care too much over what volume of space-time we’re calculating this probability. Over a Planck-length four-volume, over a metre to the fourth power, or over a Hubble volume for 15 billion years: the probabilities of an object being produced in any of these spaces are all approximately of the same magnitude, $e^{-1.2 \times 10^{50}}$ (more properly, the probabilities vary tremendously, but any tiny uncertainty in the $tmc^2/\hbar$ term dwarfs all these changes).
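As a sanity check on that magnitude, here is a quick Python sketch, using standard SI constants and the $m$ and $t$ above:

```python
# Exponent t*m*c^2/hbar for a 1.4 kg brain fluctuating for 0.1 s
m = 1.4            # brain mass, kg
t = 0.1            # duration of a single coherent thought, s
c = 2.998e8        # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J*s

exponent = t * m * c**2 / hbar
print(f"exponent ~ {exponent:.1e}")  # ~1.2e50, so the probability is ~exp(-1.2e50)
```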

Similarly, we don’t need to consider the entropy of producing a specific brain (or any other structure): a small change in mass overwhelms the probability of any specific mass setup being produced. Here’s a rough argument for that. The Bekenstein bound puts a limit on the number of bits of information in a volume of space of given size and given mass (or energy). For a mass $m$ and radius $R$, it is approximately

$$\frac{2\pi R m c}{\hbar \ln 2} \approx 2.577 \times 10^{43}\, mR \text{ bits},$$

with $m$ in kilograms and $R$ in metres.

Putting $m = 1.4\text{ kg}$ and $R = 0.1\text{ m}$, we get that the number of possible different states in a brain-like object of brain-like mass and size is less than

$$2^{3.6 \times 10^{42}},$$

which is much, much, much, …, much, much less than the inverse of $e^{-1.2 \times 10^{50}}$.
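To see the gap concretely, one can compare the two quantities in log space; a minimal sketch, using the usual $2.577 \times 10^{43}$ Bekenstein coefficient for $m$ in kilograms and $R$ in metres:

```python
import math

m, R = 1.4, 0.1                   # brain-like mass (kg) and radius (m)
bits = 2.577e43 * m * R           # Bekenstein bound: ~3.6e42 bits
log10_states = bits * math.log10(2)           # log10 of 2^bits
log10_inv_prob = 1.2e50 * math.log10(math.e)  # log10 of e^(1.2e50)
print(f"bits           ~ {bits:.1e}")            # ~3.6e42
print(f"log10(#states) ~ {log10_states:.1e}")    # ~1.1e42
print(f"log10(1/prob)  ~ {log10_inv_prob:.1e}")  # ~5.2e49, vastly larger
```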

Quantum fluctuations and causality

What is interesting is that the probability expression is exponentially linear in $t$:

$$e^{-tmc^2/\hbar} \cdot e^{-t'mc^2/\hbar} = e^{-(t+t')mc^2/\hbar}.$$

Therefore it seems that the probability of producing one brain of duration $t$, and another independent brain of duration $t'$, is the same as that of producing one brain of duration $t + t'$. Thus it seems that there is no causality in quantum fluctuating Boltzmann brains: any long-duration brain produced is merely a sequence of smaller brain-moments that happen to be coincidentally following each other (though I may have misunderstood the papers here).
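A minimal numerical illustration of that additivity (the per-second constant $mc^2/\hbar \approx 1.2 \times 10^{51}$ is for the 1.4 kg brain above):

```python
import math

K = 1.2e51             # m*c^2/hbar for a 1.4 kg brain, per second of duration
t1, t2 = 0.1, 0.3      # two brain durations, in seconds
log_p_separate = -K * t1 - K * t2   # two independent fluctuations
log_p_together = -K * (t1 + t2)     # one single, longer fluctuation
# The log-probabilities agree (up to float rounding): no premium on continuity.
print(math.isclose(log_p_separate, log_p_together))  # True
```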

Nucleation and Boltzmann simulations

If we understand dark energy correctly, it will transform our universe into a de Sitter universe. In such a universe, the continuing expansion acts like the event horizon of a black hole, and sometimes spontaneous objects will be created, similarly to Hawking radiation. Thus a de Sitter space can nucleate: spontaneously create objects. The probability of a given object of mass $m$ being produced is given as

$$e^{-2\pi m c^2/(\hbar H_\Lambda)},$$

where $H_\Lambda$ is the Hubble rate of the far-future de Sitter universe.

This number is much, much, much, much, …, much, much, much, much smaller than the quantum fluctuation probability. But notice something interesting about it: it has no time component. Indeed, the objects produced by nucleation are actual objects: they endure. Think of a brain in a sealed jar, floating through space.
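For a sense of scale, here is the nucleation exponent for a brain-mass object; a sketch, assuming a far-future de Sitter Hubble rate of roughly $1.8 \times 10^{-18}\ \text{s}^{-1}$ (an extrapolation from current dark-energy measurements):

```python
import math

m, c, hbar = 1.4, 2.998e8, 1.055e-34  # kg, m/s, J*s
H = 1.8e-18                            # assumed de Sitter Hubble rate, 1/s
exponent = 2 * math.pi * m * c**2 / (hbar * H)
print(f"exponent ~ {exponent:.0e}")    # ~4e69, versus ~1.2e50 for a fluctuated brain
```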

Now, a normal brain in empty space (and at near-absolute-zero temperatures) will decay at once; let’s be generous, and give it a second of survival in something like a functioning state.

EDIT: there is a mistake in the following, see here.

Creating $n$ independent one-second brains is an event of probability

$$\left(e^{-2\pi m c^2/(\hbar H_\Lambda)}\right)^n = e^{-2\pi nm c^2/(\hbar H_\Lambda)}.$$

But creating a brain that lasts for $n$ seconds will be an event of probability

$$e^{-2\pi M_n c^2/(\hbar H_\Lambda)},$$

where $M_n$ is the minimum mass required to keep the brain running for $n$ seconds.

It’s clear that $M_n$ can be way below $nm$. For example, the longest moonwalk was 7 h 36 min 56 s (Apollo 17, second moonwalk), or 27,416 seconds. To do this, the astronauts used spacesuits with a mass of around 82 kg. If you estimate their own body mass at roughly 100 kg, we get $M_{27{,}416} \leq 182 \text{ kg} \ll 27{,}416 \times 1.4 \approx 38{,}000 \text{ kg} = nm$.
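The arithmetic behind that comparison, with the text’s numbers (the 100 kg body mass is the rough estimate above):

```python
n = 7 * 3600 + 36 * 60 + 56    # longest moonwalk: 27,416 seconds
suit, body = 82, 100           # kg: spacesuit mass, rough body-mass estimate
M_n = suit + body              # mass that sufficed to run a brain for n seconds
print(n, M_n, n * 1.4)         # 27416, 182, 38382.4 -- M_n is far below n*m
```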

This means that for nucleated Boltzmann brains, unlike for quantum fluctuations, most observer moments will be parts of long-lived individuals, with a life experience that respects causality.

And we can get much, much more efficient than that. Since mass is the real limit, there’s no problem in using antimatter as a source of energy. The human brain runs at about 20 watts; one half-gram of matter with one half-gram of antimatter produces enough energy to run this for about $4.5 \times 10^{12}$ seconds, or 140 thousand years. Now, granted, you’ll need a larger infrastructure to extract and make use of this energy, and to shield and repair the brain; however, this larger infrastructure doesn’t need to have a mass anywhere near $7 \times 10^{22}$ kilograms (which is the order of magnitude of the mass of the moon itself).
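The energy arithmetic, as a quick check with standard constants:

```python
c = 2.998e8                  # speed of light, m/s
E = 0.001 * c**2             # joules from 0.5 g matter + 0.5 g antimatter
seconds = E / 20             # runtime of a 20 W brain
years = seconds / 3.156e7    # seconds per year
print(f"{seconds:.1e} s ~ {years:.0f} years")  # ~4.5e12 s, about 140,000 years
```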

And that’s all neglecting improvements to the energy efficiency and durability of the brain. It seems that the most efficient and durable version of a brain—in terms of mass, which is the only thing that matters here—is to run the brain on a small but resilient computer, with as much power as we’d want. And, if we get to re-use code, then we can run many brains on a slightly larger computer, with the mass growth being less than the growth in the number of brains.

Thus, most nucleated Boltzmann brain observer-moments will be inside a Boltzmann simulation: a spontaneous (and causal) computer simulation created in the deep darkness of space.