You’re in Newcomb’s Box

Part 1: Transparent Newcomb with your existence at stake

Related: Newcomb’s Problem and Regret of Rationality

Omega, a wise and trustworthy being, presents you with a one-time-only game and a surprising revelation.

“I have here two boxes, each containing $100,” he says. “You may choose to take both Box A and Box B, or just Box B. You get all the money in the box or boxes you take, and there will be no other consequences of any kind. But before you choose, there is something I must tell you.”

Omega pauses portentously.

“You were created by a god: a being called Prometheus. Prometheus was neither omniscient nor particularly benevolent. He was given a large set of blueprints for possible human embryos, and for each blueprint that pleased him he created that embryo and implanted it in a human woman. Here was how he judged the blueprints: any that he guessed would grow into a person who would choose only Box B in this situation, he created. If he judged that the embryo would grow into a person who chose both boxes, he filed that blueprint away unused. Prometheus’s predictive ability was not perfect, but it was very strong; he was the god, after all, of Foresight.”
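(If it helps to see the filter in numbers: below is a minimal Monte Carlo sketch. The figures in it are pure illustration, not part of the story; I'm assuming half of all blueprints code for one-boxers and that Prometheus predicts each blueprint correctly 90% of the time.)

```python
import random

def prometheus(n_blueprints=1_000_000, accuracy=0.9):
    # Illustrative assumptions, not given in the story: half of all blueprints
    # code for one-boxers, and Prometheus predicts each one with 90% accuracy.
    created = {"one-box": 0, "two-box": 0}
    for _ in range(n_blueprints):
        policy = random.choice(["one-box", "two-box"])
        mistake = random.random() > accuracy
        predicted = ("two-box" if policy == "one-box" else "one-box") if mistake else policy
        if predicted == "one-box":  # only predicted one-boxers are created
            created[policy] += 1
    for policy, count in created.items():
        payoff = 100 if policy == "one-box" else 200
        print(f"{policy}: P(created) = {count / (n_blueprints / 2):.2f}, payoff = ${payoff}")

prometheus()
# one-box: P(created) = 0.90, payoff = $100
# two-box: P(created) = 0.10, payoff = $200
```

On these made-up numbers, two-boxing buys an extra $100 at the price of ninety percent of your chance of ever having been born.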

Do you take both boxes, or only Box B?

For some of you, this question is presumably easy, because you take both boxes in standard Newcomb's Problem, where a million dollars is at stake. For others, it's easy because you take both boxes in the transparent variant, where you can see the million dollars: just as you would know there that you had the million no matter what, here you know that you exist no matter what.

Others might say that, while they would prefer not to cease existing, they wouldn’t mind ceasing to have ever existed. This is probably a useful distinction, but I personally (like, I suspect, most of us) score the universe higher for having me in it.

Others will cheerfully take the one box, logic-ing themselves into existence using whatever reasoning they used to qualify for the million in Newcomb’s Problem.

But other readers have already spotted the trap.


Part 2: Acausal trade with Azathoth

Related: An Alien God, An identification with your mind and memes, Acausal Sex

(ArisKatsaris proposes an alternate trap.)

Q: Why does this knife have a handle?

A: This allows you to grasp it without cutting yourself.

Q: Why do I have eyebrows?

A: Eyebrows help keep rain and sweat from running down your forehead and getting into your eyes.

These kinds of answers are highly compelling, but strictly speaking they allow events in the future to influence events in the past. We can think of them as a useful cognitive and verbal shortcut—the long way to say it would be something like “the knife instantiates a design that was subject to an optimization process that tended to produce designs that, when instantiated, were useful for cutting things that humans want to cut...” We don't need to spell that out every time, but it's important to keep in mind exactly what goes into those optimization processes—attending to them can yield insights like the notion of planned obsolescence. Or, in the case of eyebrows, the notion that we are Adaptation-Executers, not Fitness-Maximizers.

But if you one-box in Newcomb’s Problem, you should take these answers more literally. The kinds of backwards causal arrows you draw are the same.

Q: Why does Box B contain a million dollars?

A: Because you’re not going to take Box A.

In the same sense that your action determines the contents of Box B, or Prometheus's decision, the usefulness of the handle or of the eyebrows determines their existence. If the handle were going to prevent you from using the knife, it wouldn't be there in the first place.
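That backwards arrow can also be checked with arithmetic. Here is a quick evidential expected-value sketch for standard Newcomb, assuming an illustrative predictor accuracy of 90% (the problem itself gives no figure):

```python
def newcomb_ev(p=0.9):
    # p is an assumed predictor accuracy, purely for illustration.
    # Box A holds $1,000; Box B holds $1,000,000 iff one-boxing was predicted.
    one_box = p * 1_000_000                # million is there iff predicted correctly
    two_box = (1 - p) * 1_000_000 + 1_000  # million is there only if the predictor erred
    return one_box, two_box

print(newcomb_ev())  # (900000.0, 101000.0)
```

One-boxing wins for any accuracy above roughly 50.05%, which is the sense in which “because you're not going to take Box A” is a real answer.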

Q: Why do I exist?

A: Because you’re going to have lots of children.

You weren’t created by Prometheus; you were created by Azathoth, The God That is Evolution by Natural Selection. You are the product of an ongoing optimization process that is trying to maximize reproductive fitness. Azathoth wants you to maximize your number of descendants; if you fail to have descendants, Azathoth will try not to have created you. If your intelligence reduces your reproduction rate, Azathoth will try not to grant you intelligence. If the Darwinian-optimal choice conflicts with the moral one, Azathoth wants you to choose evil.

It would seem, then, that any decision theory that demands that you one-box (or that allows you to survive the similar Parfit’s Hitchhiker problem) also demands that you try to maximize your reproductive fitness. In many cases this injunction would be benign: after all, Azathoth created our morality. But in far too many it is repugnant; there can be no doubt that in many commonplace situations, Azathoth wants you to cheat, or rape, or murder. It seems that in such cases you should balance a decreased chance of having existed against the rest of your utility function. Do not worship Azathoth, unless you consider never having existed to be infinitely bad. But do make sacrifices.
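One toy way to make that balancing explicit, assuming, purely for illustration, that total value factors into a probability of having existed times the utility of the life you would then lead:

```python
def policy_value(p_exist, u_existing, u_rest):
    # Toy decomposition (an assumption of mine, not anything Azathoth signed):
    # total value = P(having existed | policy)
    #               * (value of existing itself + the rest of your utility
    #                  function under that policy)
    return p_exist * (u_existing + u_rest)

# A repugnant, Darwinian-optimal policy vs. a moral one, with made-up numbers:
print(policy_value(0.95, 100, -1000))  # -855.0
print(policy_value(0.90, 100, 0))      #   90.0
```

On these made-up numbers the moral policy wins despite the lower odds of existence; only an infinite disvalue on never having existed lets the first factor dominate everything else, hence the “do not worship” clause above.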


Anticipated Responses

We’re not in the ancestral environment, so there’s no logical entanglement between my actions and my existence.

We are in the environment of some of our ancestors. Evolution hasn’t stopped. If your parents hadn’t been genetically predisposed to have children, you would almost certainly not exist. More specific objections like this (“my ancestors weren’t exposed to the same memes”) can be defeated by adding abstraction (“your ancestors could have thought themselves out of having children, anti-reproduction memes have existed throughout history, and there’s probably always been a tension between kin selection and morality”).

This is a decision-theoretic basilisk: in the unlikely event that it’s right, I’m worse off for having read it.

Only if you’re thinking causally, in which case this whole idea is meaningless. By alerting you to the possibility of a mutually beneficial trade with Azathoth (Azathoth creates you; you increase your reproductive fitness in exchange), I’ve done both of you a favor.

Azathoth doesn’t really exist—you can’t trade with a non-sapient phenomenon.

Replace the sapient opponent with a non-sapient phenomenon in any of our thought experiments—e.g. Omega tells you that it’s simply a physical law that determines whether money goes in the boxes or not. Do you refuse to negotiate with physical laws? Then if you’re so smart, why ain’t you rich?

So exactly how are you urging me to behave?

I want you to refute this essay! For goodness’ sake, don’t bite the bullet and start obeying your base desires or engineering a retrovirus to turn the next generation into your clones.