An AI that “valued” keeping the world looking roughly the way it does now, that was specifically instructed never to seize control of more than X number of each of several thousand different kinds of resources,
Define ‘seize control.’ Wouldn’t such an AI be motivated to understate its effective resources, or to create other nominally independent AIs with identical objectives and fewer restraints, or otherwise circumvent that constraint?