I suspect that you mean something like:
If there is an objective, universal morality, then agents converge on it in the limit of increasing intelligence.
And thus perhaps paperclip maximizers have a tendency to become something else.