How about ‘out-of-control superintelligence’? (Either because it’s uncontrollable or at least not controlled.) That carries the appropriately alarming connotations that it’s doing its own thing and that we can’t stop it (or aren’t doing so anyway).
I think this may be proving Raemon’s point that there are a wide range of concepts. I consider the weaker alignment connotations of ‘independence-gaining’ a feature, not a bug, since we can say ‘ethical independence-gaining AGI’ or ‘aligned independence-gaining AGI’ without it sounding like an oxymoron. Also, I am not sure superintelligence is required to gain independence: it may be possible to gain independence just by thinking longer than a human, without thinking faster. That said, if out-of-control superintelligence is the right concept you are trying to get across, then use that.