I think your explanation may be correct, but I don’t understand why torture would be the intuitive answer even so. First, if I select torture, everyone in the universe gets tortured, which means I get tortured. If instead I select dust speck, I get a dust speck, which is vastly preferable. Second, I would prefer a universe with a bunch of me to one with just me, because I’m pretty awesome so more me is pretty much just better. Basically I just fail to see a downside to the dust speck scenario.
The downside to the dust speck scenario is that lots and lots and lots of you get dust-specked. But yes, I think the thought experiment is seriously impaired by the fact that the existence of more copies of you is likely a bigger deal than whether they get dust-specked.
Perhaps we can fix it, as follows: Omega has actually set up two toy universes, one with 3^^^^3 of you who may or may not get dust-specked, one with just one of you who may or may not get tortured. Now Omega tells you the same as in ike’s original scenario, except that it’s “everyone sharing your toy universe” who will be either tortured or dust-specked.
The idea was that your choice doesn’t change the number of people, so this shouldn’t affect the answer.
That seems, if you don’t mind my saying so, an odd thing to say when discussing a version of Newcomb’s problem. (“Your choice doesn’t change what’s in the boxes, so …”)
In the first version, there’s no causal relation between your choice and the number of people in the world. In the third, there is, and in the middle one, anthropics must also be considered.
I gave multiple scenarios to make this point.
If the predictor in Newcomb's problem doesn't touch the boxes but merely tells you that it predicts your choice matches what's in the box, the problem turns into the Smoking Lesion scenario.
Specks is supposed to be the intuitive answer.
That’s why I gave scenarios where your choice doesn’t causally determine the number of people, which is where Newcomblike scenarios come in.