The exact payoffs depend on the unknown details of how the Oracle achieves its 99% accuracy. Here is an inconvenient scenario that is consistent with the given description:
Suppose that 99% of the population don’t even think about the decision; they just follow inherent preferences that are nearly equally prevalent. The Oracle gets almost all of these correct, but fails on 1% of inherent one-boxers, giving them nothing instead of $1,000,000. In the remaining cases, where people do actually think about it, the Oracle is always wrong, and everyone who thinks about the situation knows this.
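To check that this scenario is at least roughly consistent with the stated accuracy, here is a back-of-the-envelope calculation. The fractions below pin the scenario’s soft numbers (“almost all”, “nearly equally prevalent”) to round values for illustration; strictly they come to about 98.5%, and the slack in those soft numbers is enough to tune it to exactly 99%:

```python
# Back-of-the-envelope check of the Oracle's overall accuracy in this scenario.
# Assumed (rounded) numbers: 99% non-thinkers with a 50/50 preference split;
# the Oracle misses 1% of inherent one-boxers and is wrong about every thinker.
p_non_thinker = 0.99
p_one_box_pref = 0.5
miss_on_one_boxers = 0.01

error = (p_non_thinker * p_one_box_pref * miss_on_one_boxers  # missed one-boxers
         + (1 - p_non_thinker) * 1.0)                         # all thinkers
print(f"overall accuracy: {1 - error:.3%}")  # ~98.5%, close to the stated 99%
```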
Since you are actually thinking about it, you’re one of the people for whom the Oracle’s prediction is always wrong. If you end up taking one box, it will be empty. If you take both boxes, you will find $1,001,000. In this scenario, CDT, EDT, and FDT all agree that you should take both boxes.
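A quick Monte-Carlo sanity check of the same scenario bears this out. This is only a sketch: the 50/50 choice model for thinkers is my own illustrative assumption, since the Oracle is wrong about them whatever they choose:

```python
# Simulate the hypothetical population and report accuracy plus the
# mean payoff a *thinker* gets for each choice.
import random

rng = random.Random(0)
N = 1_000_000
hits = 0
thinker_pay = {"one-box": [], "two-box": []}

for _ in range(N):
    thinker = rng.random() < 0.01
    choice = rng.choice(["one-box", "two-box"])  # thinkers modelled 50/50 too
    if thinker:
        # The Oracle is always wrong about people who think it through.
        prediction = "two-box" if choice == "one-box" else "one-box"
    elif choice == "one-box" and rng.random() < 0.01:
        prediction = "two-box"  # the Oracle's rare miss on inherent one-boxers
    else:
        prediction = choice     # otherwise correct on non-thinkers
    hits += prediction == choice
    payoff = ((1_000_000 if prediction == "one-box" else 0)
              + (1_000 if choice == "two-box" else 0))
    if thinker:
        thinker_pay[choice].append(payoff)

print(f"overall accuracy: {hits / N:.3%}")  # ~98.5%
for c, pays in thinker_pay.items():
    print(f"thinker mean payoff, {c}: ${sum(pays) / len(pays):,.0f}")
# thinker one-boxers get $0; thinker two-boxers get $1,001,000
```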
In many other scenarios, possibly even “most” in some ill-defined sense, FDT says you should take one box.