We’re probably not at the peak of the fitness landscape for mammalian intelligence, but I’d be surprised if we weren’t reasonably close to a local maximum.
This makes me wonder: if we create an FAI and there are alien uFAIs out there, will they be more intelligent because they had more time, or is there a general upper limit on intelligence? I suppose if there is a total limit to self-improvement for any kind of general intelligence, then all that matters is the acquisition of resources? So any alien uFAI who was able to acquire more raw resources, at the time our FAI reaches the upper bound for intelligence, could subdue our FAI by brute force?
So any alien uFAI who was able to acquire more raw resources, at the time our FAI reaches the upper bound for intelligence, could subdue our FAI by brute force?
No. Even assuming an overwhelming intelligence superiority, it would not be possible to subdue a competing superintelligence under any physics remotely like the physics we know. Except, of course, if you catch it before it is aware of your existence.
Given the capability to travel at a high fraction of the speed of light and to consume most of a star system's resources for further expansion, the speed of light imposes a hard lower bound on how much of the cosmic commons you can consume before the smarter AI can catch you.
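A rough way to see why the light-speed bound matters (all numbers here are hypothetical, not from the thread): an expansion front moving at a fraction of c sweeps out a sphere whose volume grows with the cube of its radius, and no pursuer, however intelligent, can expand faster than c. So the pursuer's advantage in claimed volume is capped by the cube of the speed ratio, which can never erase an established head start.

```python
from math import pi

def reachable_volume_ly3(speed_fraction, years):
    """Volume, in cubic light-years, of the sphere an expansion front
    moving at speed_fraction * c sweeps out in the given number of years."""
    radius = speed_fraction * years  # radius in light-years
    return 4.0 / 3.0 * pi * radius ** 3

# Hypothetical numbers: a "dumber" AI expanding at 0.5c for 1000 years,
# versus a smarter rival expanding at 0.99c over the same 1000 years.
head_start = reachable_volume_ly3(0.5, 1000.0)
pursuer = reachable_volume_ly3(0.99, 1000.0)

# The pursuer's edge is only the cube of the speed ratio, (0.99/0.5)^3,
# roughly 7.8x -- bounded no matter how much smarter it is.
ratio = pursuer / head_start
```

The point of the sketch: intelligence can raise `speed_fraction` toward 1, but never past it, so the best achievable advantage in swept volume is finite and fixed by physics rather than by cleverness.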
The problem, then, is that having more than one superintelligence, without the ability to cooperate, guarantees that a lot of the resources that could otherwise have been spent on fun will be squandered.