He is spending a tiny amount of resources to make it more likely that fanfiction will be made of him, thus nudging an infinity of worlds very slightly towards instantiating him instead of some other arbitrary goal system.
Considering that people are against paperclippers, I’d expect the best thing to do would be to make sure people are ignorant of the possibility.
But conditional on people writing fiction about superintelligences with non-Friendly goal systems in the first place, he’d prefer those goal systems be about paperclips. He knows that no AI researcher in any world is going to go “so let’s make our AI maximize paperclips because, I mean, why not… wait a second! There are all these memes telling me specifically that it’s bad to make a paperclip maximizer!”, and none is going to go “haha, let’s literally make a paperclip maximizer, since culture has primed me to do it” either.