Anthropics: A Short Note on the Fission Riddle

In The I-Less Eye, R Wallace considers a situation in which an entity clones itself a hundred times, which leads to a surprising paradox. I'll argue that there's a rather simple flaw in the argument made in the linked article, but first I'll summarise the riddle.

Suppose you are the original. After you've cloned yourself, you now have a 50% chance of being the original and a 50% chance of being the clone. This should apply regardless of what clones already exist, so after cloning yourself 100 times, you should have a 1/2^100 chance of being the original.

On the other hand, this seems strange. Intuitively, it seems as though you ought to have a 1/101 chance of being the original, as there are 101 copies. Further, why does cloning one at a time give a different answer from creating all 100 clones at once?

Solution

In order to solve this riddle, we only have to figure out what happens when you've been cloned twice and whether the answer should be 1/3 or 1/4. The first step is correct: the subjective probability of being the original should be 1/2 after you've pressed the cloning button once. However, after the cloning button has been pressed twice, in addition to the agents who underwent that first split, we now have an agent that falsely remembers undergoing it.

Distributing the probability evenly among the agents who either had that experience or merely remember it, we get a 1/3 chance of the memory being false and a 2/3 chance of it being real. If it is a real memory, then half of that (a 1/3 chance) is the probability of being the original and the other half (also 1/3) is the chance of being the first clone.
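To make the two rules concrete, here is a minimal sketch (in Python) comparing them. It assumes the simplest possible model, in which only the original presses the button and every new clone inherits the original's memories; the function names are my own, not anything from the linked article.

```python
def halving_rule(n_presses: int) -> float:
    """Probability of being the original if every remembered split
    is treated as a real 50/50 split (the rule the riddle relies on)."""
    return 0.5 ** n_presses


def uniform_rule(n_presses: int) -> float:
    """Probability of being the original if probability is spread
    evenly over every agent who remembers the presses, whether the
    memory is real or inherited (the 1/n-style rule argued for here)."""
    n_copies = n_presses + 1  # the original plus one clone per press
    return 1 / n_copies


for n in (1, 2, 100):
    print(f"{n} presses: halving = {halving_rule(n):.3g}, uniform = {uniform_rule(n):.3g}")
```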

So the answer at the second step should be 1/3 instead of 1/4, and continued application will provide the answer for the case of 100 clones. I'll admit that I've sketched out the reasoning for the 1/n solution (where n is the total number of copies) rather than providing a formal proof. However, I would suggest that I've sufficiently demonstrated that halving each time is mistaken, as it assumes that each remembered split is real.

However, we can verify that the 1/n solution produces sensible results. Exactly two agents experience the process of each split (though more agents remember it), so there is a 2/n chance of you experiencing the process and a 1/n chance of you experiencing it as the original. This means there is a 50% chance of you "waking up" as the original after the split, given that you actually underwent the split and didn't just falsely remember it.
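Under the same assumed model, this consistency check can be spelled out as a short calculation; again, the helper name is purely illustrative.

```python
def p_original_given_real_last_split(n_presses: int) -> float:
    """Conditional probability of being the original, given that your
    memory of the most recent split is real, under the uniform rule."""
    n_copies = n_presses + 1
    p_original = 1 / n_copies           # the original, under the uniform rule
    p_experienced_last = 2 / n_copies   # the original plus the newest clone
    return p_original / p_experienced_last


for n in (1, 2, 100):
    print(n, p_original_given_real_last_split(n))  # 0.5 every time
```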

Please note that I'm not taking a position in this post as to whether subjective probabilities ultimately are or aren't coherent, just arguing that this particular argument is fallacious.

Finally, I'll note a few questions that this opens up. If we have to include in anthropic reasoning all agents who remember a situation, and not just those who experienced it, what actually counts as remembering a situation? After all, in the real world, memories are always imperfect. Secondly, what if an agent has a memory but never accesses it? Does that still count?

EDIT: As you can see from my response to the comments, this post has some issues. Hopefully, I am able to update it at some point, but this isn't an immediate priority.
