About atheists vs theists and undergrads vs philosophers, I think two-boxing is a position that preys on your self-image as a rationalist. It feels like you are getting punished for being rational, like you are losing not because of your choice, but because of who you are (I would say your choice is embedded in who you are, so there is no difference). One-boxing feels like magical thinking. Atheists and philosophers have stronger self-images as rationalists. Most haven’t grokked this:
How can you improve your conception of rationality? Not by saying to yourself, “It is my duty to be rational.” By this you only enshrine your mistaken conception. Perhaps your conception of rationality is that it is rational to believe the words of the Great Teacher, and the Great Teacher says, “The sky is green,” and you look up at the sky and see blue. If you think: “It may look like the sky is blue, but rationality is to believe the words of the Great Teacher,” you lose a chance to discover your mistake.
Will’s link has an Asimov quote that supports the “self-image vs right answer” idea, at least for Asimov:
I would, without hesitation, take both boxes . . . I am myself a determinist, but it is perfectly clear to me that any human being worthy of being considered a human being (including most certainly myself) would prefer free will, if such a thing could exist. . . Now, then, suppose you take both boxes and it turns out (as it almost certainly will) that God has foreseen this and placed nothing in the second box. You will then, at least, have expressed your willingness to gamble on his nonomniscience and on your own free will and will have willingly given up a million dollars for the sake of that willingness, itself a snap of the finger in the face of the Almighty and a vote, however futile, for free will. . . And, of course, if God has muffed and left a million dollars in the box, then not only will you have gained that million, but far more important you will have demonstrated God's nonomniscience.
Seems like Asimov isn’t taking the stakes seriously enough. Maybe we should replace “a million dollars” with “your daughter here gets to live.”
And only coincidentally signalling that his status is worth more than a million dollars.
But losing the million dollars also shoves your ultimate predictability in your face.
Voluntarily taking a loss in order to insult yourself doesn’t seem rational to me.
Plus, that’s not a form of free will I even care about. I like that my insides obey laws. I’m not fond of the massive privacy violation, but that’d be there or not regardless of my choice.
Two-boxing is relying on your own power to think and act. One-boxing is basically knuckling under to the supposed power of the predictor: I will trust that I am helpless before him; he is an agent and I am his object.
Theists one-box and atheists two-box. Not surprising.
The twist here at LW is that lots of people seem terribly enamored of these recursive, ill-posed problems, and think they have solutions to them. That's where the one-boxing comes from here, IMO.
I'd one-box because there's no way I'd risk losing a million dollars to get an extra thousand based on arguments about a problem which bores me so much I have trouble paying attention to it.
What if Box B contains $1,500 instead of $1,000,000 but Omega has still been right 999 times out of 1000?
You did get me to pay a little more attention to the problem. I'd two-box in that case. I'm not sure where my crossover is.
Edited to add: I think I got it backwards. I'd still one-box. Committing to one-boxing seems advantageous if Omega is reasonably reliable.
I suppose that then you could put numbers on whether the person will reliably keep commitments.
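A quick evidential expected-value check backs up that reversal. This is a minimal sketch, assuming the predictor's 999-in-1000 hit rate applies symmetrically whichever way you choose (the problem as posed doesn't pin that down):

```python
# Evidential expected value for each choice in the $1,500 variant,
# assuming the predictor's 999/1000 hit rate is the same for
# one-boxers and two-boxers.
SMALL = 1_000      # amount always sitting in the transparent box
ACCURACY = 0.999   # P(prediction matches your actual choice)

def ev_one_box(big: float) -> float:
    # The big box is filled iff the predictor foresaw one-boxing.
    return ACCURACY * big

def ev_two_box(big: float) -> float:
    # You keep the small box either way; the big box is filled
    # only when the predictor mispredicts you.
    return SMALL + (1 - ACCURACY) * big

for big in (1_500, 1_000_000):
    print(f"big box = ${big:>9,}: "
          f"one-box EV = ${ev_one_box(big):,.2f}, "
          f"two-box EV = ${ev_two_box(big):,.2f}")
```

Setting the two expressions equal puts the crossover at SMALL / (2 * ACCURACY - 1), about $1,002 here, so at $1,500 one-boxing still comes out ahead: $1,498.50 versus $1,001.50.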
Best analysis of Newcomb’s Paradox I’ve seen so far—boring. There’s nothing to see here. It all comes down to how you model the situation and what your priors are.
I find it hard to imagine a situation where I put more belief in the Predictor's ability to predict than in its ability to present false evidence whose trick I can't figure out.
I'd two-box because I see no reason to risk losing anything. In the face of perceived trickery, I'm all the more betting on causality.