But yes, I think the thought experiment is seriously impaired by the fact that the existence of more copies of you is likely a bigger deal than whether they get dust-specked.
The idea was that your choice doesn’t change the number of people, so this shouldn’t affect the answer.
That seems, if you don’t mind my saying so, an odd thing to say when discussing a version of Newcomb’s problem. (“Your choice doesn’t change what’s in the boxes, so …”)
In the first version, there’s no causal relation between your choice and the number of people in the world. In the third, there is, and in the middle one, anthropics must also be considered.
I gave multiple scenarios to make this point.
If the predictor in Newcomb's problem doesn't touch the boxes but merely tells you that they predict your choice will match what's in the box, it turns into the smoking lesion scenario.
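To make the Newcomb comparison concrete, here is a minimal sketch (not from the thread) of the expected-payoff arithmetic, assuming the standard amounts of $1,000,000 in the opaque box and $1,000 in the transparent one, and a predictor with accuracy p. The point is that "your choice doesn't change what's in the boxes" cuts no ice when choice and contents are this strongly correlated:

```python
def expected_value(one_box: bool, p: float) -> float:
    """Expected payoff in standard Newcomb's problem.

    Assumes the predictor is right with probability p, and the usual
    payoffs: $1,000,000 in the opaque box, $1,000 in the transparent box.
    """
    big, small = 1_000_000, 1_000
    if one_box:
        # With probability p the predictor foresaw one-boxing,
        # so the opaque box contains the million.
        return p * big
    # With probability p the predictor foresaw two-boxing,
    # so the opaque box is empty; the small box is always taken.
    return (1 - p) * big + small

print(expected_value(True, 0.99))   # roughly 990,000
print(expected_value(False, 0.99))  # roughly 11,000
```

At any accuracy above about 50.05%, one-boxing has the higher expected payoff, even though the choice has no causal effect on the box contents.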