That seems, if you don’t mind my saying so, an odd thing to say when discussing a version of Newcomb’s problem. (“Your choice doesn’t change what’s in the boxes, so …”)
In the first version, there’s no causal relation between your choice and the number of people in the world. In the third, there is, and in the middle one, anthropics must also be considered.
I gave multiple scenarios to make this point.
If the predictor in Newcomb's problem doesn't touch the boxes but merely tells you that they predict your choice will match what's already in the box, it turns into the smoking lesion scenario.
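To make the comparison concrete, here is a minimal sketch of the evidential expected values, assuming (hypothetically) a 0.99-accurate predictor and the usual $1,000,000 / $1,000 payoffs. The arithmetic is the same whether the predictor filled the boxes on the basis of its prediction or merely reports that your choice matches the contents; the conditional probabilities don't change, only the causal story does, which is what makes the variant look like smoking lesion.

```python
# Sketch: evidential expected payoff of one-boxing vs two-boxing.
# ACCURACY and the payoffs are illustrative assumptions, not from the original comment.

ACCURACY = 0.99                  # assumed predictor accuracy
BIG, SMALL = 1_000_000, 1_000    # opaque-box and transparent-box payoffs

def evidential_ev(one_box: bool) -> float:
    """Expected payoff conditional on the chosen action."""
    # P(opaque box is full | action) is the same whether the predictor
    # caused the contents or merely reports the correlation.
    p_big = ACCURACY if one_box else 1 - ACCURACY
    base = 0 if one_box else SMALL
    return base + p_big * BIG

print("one-box:", evidential_ev(True))    # ~990,000
print("two-box:", evidential_ev(False))   # ~11,000
```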