As mentioned by Carl above, a wireheading AI might still want to exist rather than not exist. So if there’s some risk you could turn it off or nuke its building or something, it would do its best to neutralize that risk. An alternate danger is that the wireheading could take the form of storing a very large number—and the more resources you have, the bigger the number you can store.