This implies solving a version of the alignment problem that includes reasonable value aggregation between different people (or between AIs aligned to different people).
We already have a solution to this: money. It’s also the only solution that satisfies some essential properties such as sybil orthogonality (especially important for posthuman/AGI societies).