If the agent had no power whatsoever to affect the world, then it wouldn't matter whether it cared or not.
So the real desire is that it must have a sufficient amount of power, but not more than some threshold beyond which it becomes too frightening.
Who gets to decide this threshold?
An AGI can kill you even if its power isn't beyond what you consider "too frightening".
The danger isn't graded on a smooth scale.
The threshold still has to be greater than zero power for its ‘care’ to matter one way or the other. And the risk that you mention needs to be accepted as part of the package, so to speak.
So who gets to decide where to place it above zero?