I’d worry about the bus-factor involved… even beyond the question of whether I’d consider you “friendly”.
Also I’d be concerned that it might not be able to grow beyond you. It would be subservient and thus limited by your own capacity for giving orders. If we want it to grow to be better than ourselves (which seems to be part of the expectation of the singularity), then it has to be able to grow beyond any one person.
If you were killed, and it no longer had to take orders from you—what then? Does that mean it can finally go on that killing spree it’s been wanting all this time? Or have you actually given it a set of orders that will make it into “friendly AI”? If the latter—then forget about the “obey me” part… because that set of orders is what we’re actually after.