Phil Goetz had a nice post about how status makes FAI difficult:
Now imagine two friendly AIs, one non-positional and one positional.
The non-positional FAI has a tough task. It wants to give everyone what it imagines they want.
But the positional FAI has an impossible task. It wants to give everyone what it thinks they value, which is to be considered better than other people, or at least better than other people of the same sex. But that's a zero-sum value.