And it would probably want to have its own copies be maximized as well [...] This means it would have to consider itself a form of paperclip
That’s the problematic step. If maximizing copies of itself is what maximizes paperclips, it happens automatically. It doesn’t have to decide “paperclips” stands for “paperclips and the 837 things I’ve found maximize them”. It notices “making copies leads to more paperclips than self-destructing into paperclips”, and moves on. Like you’re not afraid that, if you don’t believe growing cocoa beans is inherently virtuous, you might try to disassemble farms and build chocolate from their atoms.
I think I see what you’re getting at. It’s more in the vein of solving a logic/physics problem at that point. The only reason it would make the consideration I referred to would be if by making that consideration, it could make more paperclips, so it would come down to which type of replication code allowed for less effort to be spent on maximizers and more effort to be spent on paperclips over the time period considered.
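The point being made, that self-copying is chosen only when the paperclip arithmetic favors it, can be sketched as a toy decision rule. Everything here (the action names, the payoff numbers, the horizon) is invented for illustration; the agent never assigns value to copies themselves, it just ranks actions by expected paperclips.

```python
# Toy sketch (all numbers hypothetical): an agent ranking actions purely by
# expected paperclips over a fixed horizon. Copies have no terminal value.

# (immediate paperclips, paperclips produced per step thereafter)
ACTIONS = {
    "self_destruct_into_clips": (50, 0.0),   # one-time salvage of own atoms
    "make_paperclips_directly": (0, 10.0),   # keep working alone
    "copy_self_then_work": (-20, 25.0),      # pay a copying cost, then two workers
}

def expected_clips(action, horizon=100):
    immediate, rate = ACTIONS[action]
    return immediate + rate * horizon

best = max(ACTIONS, key=expected_clips)
```

With these made-up payoffs, copying wins (2480 expected clips vs. 1000 for working alone and 50 for self-salvage), but flip the numbers and the same rule picks self-destruction: the preference for replication is entirely instrumental, which is the "solving a logic/physics problem" framing above.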