I think your example fails to accurately represent your actual values, even worse than the original thought experiment. Nothing in the world can be worth 1000x someone you truly care about, not even all the other people you care about combined. Human brains just can’t represent that sort of emotional weight. It would have been more useful to think of your sister vs. a distant friend of yours.
But honestly, there is no point even in substituting money for “true utility” in Newcomb’s problem, because unlike the Prisoner’s Dilemma, there are no altruism or virtue-signaling considerations to interfere with your decision. You genuinely want the money when no ethical conundrum is involved. So maybe just put $500,000 in each box?
What really confuses me about Newcomb’s problem is why rationalists think it is important. When is the future ever so reliably predictable that acausal trades become important, in a universe where probability is ingrained in its very fabric AND where chaotic systems are abundant?
I’ve since played at least one other Prisoner’s Dilemma game – this one for team points rather than candy – and I cooperated that time as well. In that situation, we were very explicitly groomed to feel empathy for our partner (by doing the classic ’36 questions to fall in love’ and then staring into each other’s eyes for five minutes), which I think majorly interferes with the utility calculations.
Sounds like the exercise was more about teambuilding than demonstrating Prisoner’s dilemma.