But conditional on their writing fiction about superintelligences with non-Friendly goal systems in the first place, he'd prefer those goal systems be about paperclips, knowing that no AI researcher in any world is going to go "so let's make our AI maximize paperclips because, I mean, why not… wait a second! There are all these memes telling me specifically that it's bad to make a paperclip maximizer!" rather than "haha, let's literally make a paperclip maximizer since culture has primed me to do it".