“I can cause you to invert your preferences over time and pump some money out of you.”
I think the small qualifier you slipped in there, “over time”, is more salient than it appears at first.
Like most casually intuitive humans, I’ll prefer 1A over 1B, and (for the sake of this argument) 2B over 2A, and you can pump some money out of me for a bit.
But… as a somewhat rational thinker, you won’t be able to pump an unbounded amount of money out of me. Eventually I catch on to what you’re doing and your trickle of cents will disappear. I will go, “well, I don’t know what’s wrong with my feeble intuition, but I can tell that Eliezer is going to end up with all my money this way, so I’ll stop even though it goes against my intuition.” If you want to accelerate this, make the stakes worth more than a cent. Tell someone that the “mathematically wrong choice will cost you $1,000,000”, and I bet they’ll take some time to think and choose a set of beliefs that can’t be money-pumped.
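To make the “trickle of cents” concrete, here is a minimal sketch of the money pump being discussed. The payoff labels (1A/1B/2A/2B), the one-cent swap fee, and the exact pump mechanics are my own illustrative assumptions, not anything specified in the comment:

```python
# Minimal money-pump sketch against an agent with Allais-style preferences.
# The fee and the two-step pump construction are illustrative assumptions.

FEE = 0.01  # cost the exploiter charges for each preference-driven swap


def prefers(holding: str, offered: str) -> bool:
    """Allais-style (jointly inconsistent) preferences:
    the agent prefers 1A over 1B (certainty is prized),
    but 2B over 2A (the same gambles, scaled down)."""
    preferred_swaps = {("1B", "1A"), ("2A", "2B")}
    return (holding, offered) in preferred_swaps


def pump(cycles: int) -> float:
    """Cycle the agent: pay to swap 2A -> 2B, let nature resolve the
    shared branch so 2B becomes 1B, then pay again to swap 1B -> 1A
    (equivalent to holding 2A) -- back to the start, two cents poorer."""
    extracted = 0.0
    holding = "2A"
    for _ in range(cycles):
        # Step 1: agent pays a cent to trade 2A for the preferred 2B.
        if prefers(holding, "2B"):
            holding = "2B"
            extracted += FEE
        # Nature resolves the common branch: 2B turns into 1B.
        holding = "1B"
        # Step 2: agent pays a cent to trade 1B for the certain 1A.
        if prefers(holding, "1A"):
            holding = "2A"
            extracted += FEE
    return extracted


print(round(pump(100), 2))  # 100 cycles at two cents each -> 2.0
```

The point of the comment survives in the code: `extracted` grows only as long as the agent keeps consenting to swaps, so a human who notices the pattern simply caps `cycles` at something small.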
Or, change the time aspect. I suspect if I were immortal (or at least believed myself to be), I would happily choose 1B over 1A, certainty be damned. Maybe I don’t get the money this round; so what, I have an infinite amount of time to earn it back. It’s the fact that I don’t get to play the game an unlimited number of times that makes certainty valuable.
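The immortality point is easy to check numerically. A small sketch, assuming the standard Allais payoffs (1A: $24,000 with certainty; 1B: $27,000 with probability 33/34 — my assumption about the numbers in play):

```python
import random

# Assumed payoffs for the two gambles discussed above.
def play_1a() -> int:
    """1A: $24,000 with certainty."""
    return 24_000

def play_1b(rng: random.Random) -> int:
    """1B: $27,000 with probability 33/34, else nothing."""
    return 27_000 if rng.random() < 33 / 34 else 0

rng = random.Random(0)       # fixed seed for reproducibility
n = 100_000                  # an "immortal" player gets many repetitions
total_a = sum(play_1a() for _ in range(n))
total_b = sum(play_1b(rng) for _ in range(n))
print(total_b > total_a)     # 1B's higher mean dominates over many plays
```

With one play, the 1/34 chance of walking away empty-handed looms large; with a hundred thousand plays, 1B’s higher expected value (~$26,206 vs. $24,000 per play) wins essentially surely, which is the intuition behind preferring 1B when the game repeats without bound.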
I can’t help but notice that all of your examples involve groups that elicit negative emotional reactions. I think it might be illustrative to also include examples of this fallacy in situations where the group X elicits positive emotional reactions. For example: wild deer are cute, therefore any movement to kill them must be bad. Or: rape victims are all deserving of our sympathy, therefore any portrayal of a rape victim as anything but pure innocence is bad. (These aren’t great examples, I admit.)