Say more about why it would be suicide? It seems to me that it would only be able to succeed through diplomacy, and would take decades before enough trust had been established. But I see no fundamental reason that war and conflict couldn't be near-completely ended forever, once the diffusion of coprotective strategies is thorough enough.
Certainly pivotal acts are intense acts of war and beget retaliation, as you say.
My definition of a guardian AI: a machine allowed to do anything it wants, but which we tried to make "be good". It has military capabilities, can self-improve, and we have no edit access.
That's suicide. Murder pills, etc.