Hah, thanks, you’re right, I didn’t think of that. Omega would need to simultaneously induce amnesia in all siblings. There goes my nice setup :-) Maybe we can invent some other plausible-sounding scenario to that effect?
Parent (by Cousin It) and grandparent (by Doug) are wrong.
If I have nine siblings about whom I know absolutely nothing except that they exist and that they are people, intelligent agents, or observers in our reference class, it is an error in reasoning for me to assume (as Doug’s argument does) that just because something unlikely happened to me, it happened or will happen to them, too.
That said, I do not know the solution to the original problem.
“Greetings, citizen. I, the King of this land, decided to perform an experiment on anthropics. This morning I flipped a coin, resolving that if it landed tails, then ten of my citizens would be drugged and wake up here...”
If you pick observers randomly from some pool of fixed size instead of creating them, the problem becomes non-anthropic. An ordinary citizen before the experiment should precommit to choosing chocolate, because this precommitment gives the average citizen higher expected utility.
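The precommitment claim can be checked with a direct, non-anthropic expected-utility calculation. The thread never specifies the actual payoffs of chocolate versus cake, so the numbers below are invented purely for illustration: suppose heads wakes one citizen and tails wakes ten, chocolate always pays 1, and cake pays 3 only on heads. Averaging over the coin (and over the fixed pool of citizens) then favors precommitting to chocolate:

```python
from fractions import Fraction

# Hypothetical setup -- the original thread does not give payoffs,
# so these numbers are invented solely to illustrate the calculation:
#   heads -> 1 citizen is drugged and woken, tails -> 10 citizens
#   chocolate pays 1 regardless of the coin
#   cake pays 3 if the coin was heads, 0 if it was tails
P_HEADS = Fraction(1, 2)
WOKEN = {"heads": 1, "tails": 10}

def payoff(choice, coin):
    if choice == "chocolate":
        return 1
    return 3 if coin == "heads" else 0

def expected_total_utility(choice):
    """Total payoff summed over all woken citizens, averaged over the coin.

    With a fixed pool of N citizens, the per-citizen expectation is just
    this total divided by N, so the comparison between strategies is the
    same either way -- no anthropic reasoning is needed.
    """
    return sum(
        (P_HEADS if coin == "heads" else 1 - P_HEADS)
        * WOKEN[coin]
        * payoff(choice, coin)
        for coin in ("heads", "tails")
    )

print(expected_total_utility("chocolate"))  # 11/2
print(expected_total_utility("cake"))       # 3/2
```

Under these (assumed) payoffs the average citizen does better by precommitting to chocolate, exactly because the calculation runs over the fixed pool rather than over observers whose existence depends on the coin.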
In that case can we just resolve to track down and amnesify all ten siblings in the case where there are ten?
I don’t understand… Maybe we’re talking about different things? Could you explain again?
So the setup is as before, we flip a coin to decide whether to create one or ten children, but now we don’t wait for an accident to make the chocolate/cake offer: on their thirtieth birthday, we track them all down, give them an amnesia drug, and then make the offer.
This seems to give the same answer as the case with accidental amnesia, right?