Ah, any software that you can run on computers that can cause the extinction of humanity, even if humans try to prevent it, would fulfill the sufficiency criterion for AGI.
A flight control program directing an asteroid redirection rocket, programmed to find a large asteroid and steer it into a collision with Earth, seems like the sort of thing that could be “software that you can run on computers that can cause the extinction of humanity” but not “AGI”.
I think it’s relevant that “kill all humans” is a much easier target than “kill all humans in such a way that you can persist and grow indefinitely without them”.
Yes, and this might be a crux between “successionists” and “doomers” with highly cosmopolitan values.