The analogy is that human genes produce a thing (human minds) which wants stuff, but the stuff it wants is different from what the human genes want.
Not nearly different enough to prevent the human genes from getting what they want in excess.
If we apply your frame of the analogy to AGI, we have slightly misaligned AGI which doesn’t cause human extinction, and instead enormously amplifies our utility.
From my perspective you’re strawmanning and failing to track the discourse here to a sufficient degree that I’m bowing out.
From my perspective you persistently ignore, misunderstand, or misrepresent my arguments, overfocus on pedantic details, and refuse to update or agree on basics.