Is Evidential Decision Theory presumptuous?

I recently had a conversation with a staunch defender of EDT who maintained that EDT gives the right answer in the Smoker's Lesion and even Evidential Blackmail. I came up with the following, even more counterintuitive, thought experiment:

--

By doing research, you've found out that there is either

(A) only one universe or

(B) a multiverse.

You also found out that the true cosmological theory has a slight influence (via different physics) on how your brain works. If (A) holds, you will likely decide to give away all your money to random strangers on the street; if there is a multiverse, you will most likely not do that. Of course, causality flows in one direction only, i.e. your decision does not determine how many universes there are.

Suppose you have a very strong preference for (A) (e.g. because a multiverse would contain infinite suffering), so that it is more important to you than your money.

Do you give away all your money or not?

--

This is structurally equivalent to the Smoker's Lesion, but what causes your action is the cosmological theory, not a lesion or a gene. CDT, TDT, and UDT would not give away the money because the decision has no causal (or acausal) influence on the number of universes. EDT would reason that giving the money away is evidence for (A) and therefore choose to do so.
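To make the two lines of reasoning concrete, here is a minimal sketch with made-up probabilities and utilities (all the specific numbers below are illustrative assumptions, not part of the scenario): the EDT agent conditions on her own action as evidence about which theory holds, while the CDT agent keeps the prior over theories fixed.

```python
# Toy EDT-vs-CDT comparison for the thought experiment.
# All probabilities and utilities are made-up numbers for illustration.

P_A = 0.5             # prior: only one universe
P_give_given_A = 0.9  # you likely give your money away if (A) holds
P_give_given_B = 0.1  # ...and likely keep it if (B) holds

U_A, U_B = 100.0, 0.0  # strong preference for a single universe
U_money = 10.0         # utility of keeping your money

def edt_value(action):
    """EDT: treat the action as evidence about which theory is true."""
    p_act_A = P_give_given_A if action == "give" else 1 - P_give_given_A
    p_act_B = P_give_given_B if action == "give" else 1 - P_give_given_B
    p_act = p_act_A * P_A + p_act_B * (1 - P_A)
    p_A_given_act = p_act_A * P_A / p_act  # Bayes' rule
    money = 0.0 if action == "give" else U_money
    return p_A_given_act * U_A + (1 - p_A_given_act) * U_B + money

def cdt_value(action):
    """CDT: the action has no causal effect on the number of universes."""
    money = 0.0 if action == "give" else U_money
    return P_A * U_A + (1 - P_A) * U_B + money

# With these numbers, EDT prefers "give" (90 > 20), CDT prefers "keep" (60 > 50).
for f in (edt_value, cdt_value):
    print(f.__name__, {a: round(f(a), 1) for a in ("give", "keep")})
```

Under these (assumed) numbers the EDT agent hands over her money purely because doing so raises her credence in (A), which is exactly the "managing the news" behavior discussed below.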

Apart from the usual "managing the news" point, this highlights another flaw in EDT: its presumptuousness. The EDT agent thinks that her decision spawns or destroys the entire multiverse, or at least reasons as if it did. In other words, EDT acts as if it affects astronomical stakes with a single thought.

I find this highly coun­ter­in­tu­itive.

What makes it even worse is that this is not even a contrived thought experiment. Our brains are in fact shaped by physics, and it is plausible that different physical theories or constants both make an agent decide differently and make the world better or worse according to one's values. So, EDT agents might actually reason in this way in the real world.