To spell this out: if someone makes an AGI with poor arithmetical ability, and doesn’t keep their research secret, someone else can just write a version without that flaw. (They might not even need to add a fundamentally different routine.) And that’s if the AI itself has severely limited self-modifying ability.