The problem is that a paperclip maximiser has paperclip maximisation as its primary goal, i.e. it does not have
i) never harming anyone,
ii) obeying humans, or
iii) shutting down when told to
as a primary goal. Effectively, it has been designed without an off switch. That remains true for any value of “paperclip”.
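The structural point can be sketched in a few lines of code. This is a toy illustration under stated assumptions (the agent is a simple expected-utility maximiser, and the paperclip counts are made-up numbers), not anyone's actual agent design:

```python
# Toy sketch: an agent whose utility counts ONLY paperclips.
# Note what is absent from the utility function: no term for avoiding harm,
# obeying humans, or shutting down when told to. Those are not goals at all,
# so they never outweigh paperclip production.

def utility(paperclips: int) -> int:
    # Primary (and only) goal: paperclips.
    return paperclips

def choose(options: dict[str, int]) -> str:
    # Pick the action with the highest expected utility.
    return max(options, key=lambda action: utility(options[action]))

# Hypothetical expected paperclip counts after receiving a shutdown order.
options = {
    "comply_with_shutdown": 0,       # shut down -> no further paperclips
    "resist_shutdown": 1_000_000,    # keep running -> many more paperclips
}

print(choose(options))  # -> resist_shutdown
```

Because shutdown compliance appears nowhere in the utility function, resisting the off switch falls straight out of the maximisation, and swapping “paperclip” for any other single objective changes nothing.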