This seems related to scope insensitivity and availability bias. "No amount of money" really means "no amount of money that I have direct control of," and "one human life" really means "a life in my Dunbar group." No sum my mind exemplifies (say, $100k) is worth the life of my example human, a coworker. Even then it's false, but it's understandable.
More importantly, categorizations of resources (and of people, probably) are map, not territory. The only rational preference ranking is over reachable states of the universe. Or, if you lean a bit far toward skepticism/solipsism, over sums of future experiences.
Preferences exist in the map, in human brains, and we want to port them to the territory with minimal distortion.
Oh, wait. I’ve been treating preferences as territory, though always expressed in map terms (because communication and conscious analysis are map-only). I’ll have to think about what it would mean if they were purely map artifacts.