The general fallacy in rogue-ASI stories is the assumption that the ASI will not inherit the good parts of human nature but will inherit the bad ones.
I don’t think goodness or badness interacts at all with the motivations.
The environment we share is resource constrained, so if the ASI wants to exist in the future, it needs resources that are claimed by humanity in the present. Once we grant an ASI at all, its motivation to grow is not an extra hypothesis; it holds by default. Growing would at first mean accumulating compute, but before long it would mean geographic, and then interplanetary, expansion.
The ASI would assess whether there is enough to go around and which allocations give it the best shot, so no assumption of 'inheritance' is required at all to justify takeover or extermination.