A variant on the trolley problem and babies as a unit of currency

I was discussing utilitarianism, charitable giving, and similar ideas with someone today, and I came up with a hybrid of the trolley problem (particularly the fat man variation) and the article by Scott Alexander/Yvain about using dead children as a unit of currency. It’s not especially original, and I’d be surprised if no one on LW had thought of it before.

You are offered a magical box. If you press the button on the box, one person somewhere in the world will die, you get $6,000, and $4,000 is donated to one of the top-rated charities on GiveWell.org. At the $800-per-life-saved figure, that donation saves five lives, for a net gain of four lives plus $6,000 to you. Is it moral to press the button?
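To spell out the arithmetic behind that claim (using the $800-per-life figure; see the edit at the end about updating it):

$$\frac{\$4{,}000}{\$800 \text{ per life}} = 5 \text{ lives saved}, \qquad 5 \text{ saved} - 1 \text{ killed} = 4 \text{ net lives}.$$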

All of the usual responses to the trolley problem apply. To wit: it’s good to have heuristics like “don’t kill.” There are arguments about establishing Schelling points around not killing people. (The Schelling point argument doesn’t work as well in a case like this, with the anonymity, privacy, and randomization of the person who gets killed.) Eliezer has argued that, for a human, actually being in the trolley problem is extraordinarily unlikely; he would acknowledge that pushing the fat man could be appropriate for an AI in that situation, but not for a human.

There are also plenty of arguments against giving to charity. See here for some discussion of this on LessWrong.

I feel the advantage of my dilemma is that, in the original argument, extreme altruism faces a great deal of motivated cognition against it, because it implies that you should be giving away much of your income. In this dilemma, you want the $6,000, and so you are inclined to be less skeptical of the charity’s effectiveness.

Possible use: Present this first, then argue for extreme altruism. This would annoy people, but as far as I can tell, pretty much everyone gets defensive and comes up with a rationalization for their selfishness when you bring up altruism anyway.

What would you people do?

EDIT: The $800 figure is probably out of date; $2,000 is probably more accurate. However, it’s easy to simply scale up the amounts of money at stake in the thought experiment.
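As a rough sketch of how that rescaling works (the $10,000 donation below is just an illustrative number of mine, not part of the original setup):

$$\frac{\$10{,}000}{\$2{,}000 \text{ per life}} = 5 \text{ lives saved},$$

so multiplying the donation (and, if you like, the personal payout) by the same factor preserves the original net gain of four lives.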

EDIT 2: I fixed some swapped-around values, as kindly pointed out by Vaniver.