i don’t think you understood my argument. i didn’t say you assign each of them 1/1000th of your total caring. i said you should assign each of them 1/1000th as much caring as you assign yourself. so you should occupy 1000/7 billion of your caring, and Bob from Randomland occupies 1/7 billion of your caring.
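to make the arithmetic concrete, here’s a minimal sketch of that normalization (the numbers are the ones above; nothing else is assumed):

```python
# numbers from the comment above: 7 billion strangers, and you care
# about each one 1/1000 as much as you care about yourself (self = 1)
POP = 7_000_000_000
CARE_PER_STRANGER = 1 / 1000

total = 1 + POP * CARE_PER_STRANGER         # unnormalized caring mass
self_share = 1 / total                      # ~ 1000 / 7 billion
stranger_share = CARE_PER_STRANGER / total  # ~ 1 / 7 billion

print(self_share)      # ~1.43e-07, i.e. roughly 1000/7e9
print(stranger_share)  # ~1.43e-10, i.e. roughly 1/7e9
```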
the entire point of my argument is it actually doesn’t matter what % of your own caring you take up. that’s not the relevant thing. the relevant thing is how much you care about each stranger relative to yourself, and the shape of your money utility curve.
I think it’s wrong that humans empirically value themselves at 1/10,000,000 of the rest of humanity. I guess I see your point: you have some budget of caring, and on occasion you are willing to dispense quite a lot of it to a single stranger. But you would not dispense 99.9999% of your caring to all the strangers combined.
% of your caring is a flawed metric that doesn’t mean anything though! even if there are 1e100 strangers out there, as long as your caring about each individual relative to yourself is still 1/1000, the fraction of your money you’re willing to donate remains constant!
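here’s a sketch of why the population size drops out, assuming log utility of money and that every stranger sits at the same wealth level (both assumptions are mine, purely for illustration, and i’m ignoring the tiny bump a spread-out donation makes to each recipient’s wealth):

```python
def donation(my_wealth, stranger_wealth, care):
    # with u(w) = log(w), marginal utility is u'(w) = 1/w: keep giving
    # while care / stranger_wealth > 1 / (my_wealth - given), so giving
    # stops exactly when my remaining wealth hits stranger_wealth / care
    keep = stranger_wealth / care
    return max(0.0, my_wealth - keep)

# the stopping point never mentions how many strangers exist:
d = donation(my_wealth=100_000, stranger_wealth=50, care=1/1000)
print(d)  # 50000.0, whether there are 1000 strangers or 1e100
```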
I don’t get it.
You have some pool of caring you are willing to donate. Then, in the case where all other humans need a donation, they will each receive pool/total_pop, so you care about each of them at pool/total_pop.
Like, if one encounters an opportunity to donate to a single stranger who needs it, people go above that pool/total_pop, but it doesn’t mean they would give more than the total pool in the previous case. The scaling is weird.
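A sketch of the pool model as I mean it (my reconstruction, with made-up numbers), showing the per-person share shrinking with population:

```python
pool = 0.5  # hypothetical: the total fraction of your caring you'd donate

# under the pool model, per-person caring is pool / total_pop,
# so it shrinks as the population grows
for total_pop in (1_000, 1_000_000, 7_000_000_000):
    print(total_pop, pool / total_pop)
```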
Your previous statements are unclear.
i don’t understand what in my original post you disagree with. there is no such thing as a fixed pool of caring; i don’t even know what that would mean. the actual constraint is that you have some finite amount of money. money is not the same as caring, because each dollar is worth a different amount depending on how much money the recipient has. caring is just a multiplier on how much other people’s happiness is worth to you compared to your own. if some of your dollars will bring so much more happiness to someone else (e.g. by saving their life) than to yourself (e.g. by buying a slightly larger apartment) that it outweighs the fact that you don’t care about them as much as yourself, then you should give those dollars away. otherwise, you shouldn’t.
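as a sketch of that rule, assuming log utility of money (my assumption; all i actually claimed is that each dollar is worth less to a richer person):

```python
from math import log

def should_give_dollar(my_wealth, their_wealth, care):
    # give a dollar away iff care * (their happiness gain) > (my happiness loss),
    # with u(w) = log(w) standing in for the unspecified happiness curve
    my_loss = log(my_wealth) - log(my_wealth - 1)
    their_gain = log(their_wealth + 1) - log(their_wealth)
    return care * their_gain > my_loss

print(should_give_dollar(100_000, 50, care=1/1000))      # True: the dollar matters hugely to them
print(should_give_dollar(100_000, 60_000, care=1/1000))  # False: they're nearly as rich as you
```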
when I read “caring” + the table, I assume something roughly equal to “percentage of attention/money/other resources spent”; otherwise, how would you normalize caring to 1 (as is done in the table)?