Correct. In fact, I probably confused things by using the word “discount” for what I am suggesting. Let me try to summarize the situation with regard to “discounting”.
Time discounting means counting distant future utility as less important than near future utility. EY, in the cited posting, argues against time discounting. (I disagree with EY, for what it is worth.)
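For concreteness, the standard formalization of time discounting (a generic textbook sketch, not necessarily the exact model EY argues against) weights utility arriving t steps in the future by a factor gamma^t, with 0 < gamma <= 1:

```python
def discounted_utility(utilities, gamma):
    """Sum a stream of future utilities, weighting the utility
    received t steps from now by gamma**t. gamma = 1 recovers
    the undiscounted (EY-endorsed) sum."""
    return sum(u * gamma**t for t, u in enumerate(utilities))

stream = [10.0, 10.0, 10.0]  # same utility now, next step, and the step after
print(discounted_utility(stream, 1.0))  # no time discounting: 30.0
print(discounted_utility(stream, 0.9))  # discounted: 10 + 9 + 8.1 = 27.1
```

The disagreement is over whether gamma should be strictly less than 1; the arithmetic itself is uncontroversial.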
“Space discounting” is a locally well-understood idea that utility accruing to people distant from the focal agent is less important than utility accruing to the focal agent’s friends, family, and neighbors. EY presumably disapproves of space discounting. (My position is a bit complicated. Distance in space is not the relevant parameter, but I do approve of discounting using a similar ‘remoteness’ parameter.)
The kind of ‘discounting’ of large utilities that I recommended in the great-grandparent probably shouldn’t be called ‘discounting’. I would sloganize it as “utilities are not additive.” The parent used the phrase ‘diminishing returns’. That is not right either, though it is probably better than ‘discounting’. Another phrase that approximates what I was suggesting is ‘bounded utility’. (I’m pretty sure I disagree with EY on this one too.)
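To illustrate the “utilities are not additive” / “bounded utility” idea, here is a toy sketch (the saturating form and the cap are my own illustrative choices, not a claim about anyone’s actual utility function): under an additive view, n identical goods are worth n times one good, while under a bounded view the total asymptotes to a cap, so very large stakes contribute less and less:

```python
import math

def additive(u, n):
    """Additive view: n goods each worth u are worth exactly n * u."""
    return u * n

def bounded(u, n, cap):
    """Bounded view (one illustrative saturating form):
    total utility approaches `cap` as the raw sum n * u grows."""
    return cap * (1.0 - math.exp(-u * n / cap))

# 1 good vs 1000 goods, each worth 1 utilon, with a cap of 100:
# the additive total grows without bound; the bounded total
# never exceeds the cap, however large the stakes.
print(additive(1.0, 1000))        # 1000.0
print(bounded(1.0, 1000, 100.0))  # just under 100
```

The point of the sketch is only structural: for a bounded agent, “a googol times the utility” does not correspond to a googol times the motivational force.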
The fact that I disagree with EY on discounting says absolutely nothing about whether I agree with EY on AI risk, reductionism, exercise, and who writes the best SciFi. That shouldn’t need to be said, but sometimes it seems to be necessary in your (XiXiDu’s) case.
How about: “Large utilities are not additive for humans”.