Right, I think positional goods and the like are among several ways the basic premises of the welfare theorems break down (and indeed, empirically, many people are sad, lonely, etc. in our modern world of abundance) - I sometimes think those theorems carry a sort of normative ‘well, just don’t worry about other people’s stuff!’ (i.e. non-envy; envy being, after all, a deadly sin). cf. Paretotopia, which makes exactly this normative case in the AI-futurism frame.
Definitely agree! The “don’t worry about other people’s stuff” argument gets thrown around a lot, and is often taken to be equivalent to “don’t be envious”, but I think that argument contains a logical mistake. Suppose person A doesn’t care directly about their relative wealth (i.e. relative wealth doesn’t enter their utility function as a term in its own right), but their utility does depend on their interactions with other people (friendships, job interviews, etc.), and those other people treat person A in a way that depends on person A’s relative wealth. Then it can be instrumentally useful for person A to increase their relative wealth, even though they don’t care about relative wealth per se. So person A has no envy, and yet their utility is (indirectly) affected by their relative wealth.
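A minimal way to write that chain of dependence down (the notation here is mine, purely to make the point explicit): let $c_A$ be A’s own consumption, $r_A$ their relative wealth, and $s$ the treatment A receives from others.

$$
U_A = u(c_A, s), \qquad s = g(r_A), \qquad \frac{\partial u}{\partial r_A} = 0 \ \text{directly, yet} \ \frac{d U_A}{d r_A} = \frac{\partial u}{\partial s}\, g'(r_A) \neq 0.
$$

There is no envy term anywhere in $u$, but so long as $\partial u / \partial s \neq 0$ and $g'(r_A) \neq 0$, A’s welfare still moves with their relative position.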
Well, why do we envy? The evopsych just-so story says that, of course, others having a lot impinges on me whether I want it to or not (my security/precarity, my liberty, my relative success in interactive situations, …).
I think you can express that in a ‘goods’ framing by glossing it as the consumption of ‘status and interactive goods’ or something like that (but noting that these are positional and contingent on the wider playing field, which the bog-standard welfare-theorem utility functions don’t allow for).
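Put in symbols (again just a sketch, not how the theorems are usually stated): the textbook welfare-theorem setup has each agent’s utility depend only on their own bundle, whereas positional/status goods make it depend on everyone else’s bundles too,

$$
\text{textbook: } u_i = u_i(x_i) \qquad \text{vs.} \qquad \text{positional: } u_i = u_i(x_i,\, x_{-i}),
$$

where $x_{-i}$ is the rest of the playing field. The standard proof of the first welfare theorem relies on the no-consumption-externalities form on the left; the status/interactive-goods story lives on the right.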