If there is in fact something morally wrong about releasing the tech (your summary doesn’t make it clear, but I’d expect it of most drastic actions Robin seems disposed to take), you can prevent it by, if necessary, murderously wielding a puppy, since attempting to release the tech would be a contextually relevant wrong act. Even if I thought it was obligatory to stop you, I might not do it. I’m imperfect.
If there is in fact something morally wrong about releasing the tech
I don’t know about morals, but I hope it was clear that the consequences were assigned a low expected utility. The potential concern would be that your morals might interfere with my seeking desirable future outcomes for the planet.
Aw, thanks...?
That is promising. Would you let me kill Dave too?
If you’re in the room with Dave, why wouldn’t you just push the AI’s reset button yourself?
See link. Depends on how I think he would update. I would kill him too if necessary.