What you’re pointing at applies if AI merely makes most work obsolete without significantly disturbing the social order otherwise, but you’re not considering (also historically common) replacement/displacement scenarios. It is clearly bad from my perspective if (e.g.) either:
1) Controllable strong AI gets used to take over the world and, in time, replace the human population with the dictator’s offspring.
2) Humans get displaced by AIs.
In either case, the surviving parties may well look back on the current state of affairs and consider their world much improved, but it’s likely that we, on reflection, would not agree.