I suppose there are a couple of main points to get across. One (assuming you subscribe to the whole FOOM thing) is that once an AGI is nearly as smart as a human, it will take no time flat for it to become vastly smarter than humans. The other is that this vastly smarter AGI is likely to treat humanity as animals in a zoo at best, and as a nuisance to be gotten rid of, like a mosquito, at worst. I am not sure what metaphor or comparison would work well for the FOOM idea to bridge the inferential distance, but one is sorely needed for any sort of elevator pitch.