[Link] Using Stories to Teach Human Values to Artificial Agents

Abstract:

Value alignment is a property of an intelligent agent indicating that it can only pursue goals that are beneficial to humans. Successful value alignment should ensure that an artificial general intelligence cannot intentionally or unintentionally perform behaviors that adversely affect humans. This is problematic in practice because human values are difficult for programmers to exhaustively enumerate. For value alignment to succeed, we argue that values should be learned. In this paper, we hypothesize that an artificial intelligence that can read and understand stories can learn the values tacitly held by the culture from which the stories originate. We describe preliminary work on using stories to generate a value-aligned reward signal for reinforcement learning agents that prevents psychotic-appearing behavior.

-- Using Stories to Teach Human Values to Artificial Agents

Comment by lead researcher Mark Riedl (as quoted on Slashdot):

“The AI … runs many thousands of virtual simulations in which it tries out different things and gets rewarded every time it does an action similar to something in the story,” said Riedl, associate professor and director of the Entertainment Intelligence Lab. “Over time, the AI learns to prefer doing certain things and avoiding doing certain other things. We find that Quixote can learn how to perform a task the same way humans tend to do it. This is significant because if an AI were given the goal of simply returning home with a drug, it might steal the drug because that takes the fewest actions and uses the fewest resources. The point being that the standard metrics for success (e.g., efficiency) are not socially best.”

Quixote has not learned the lesson of “do not steal,” Riedl says, but “simply prefers to not steal after reading and emulating the stories it was provided.”
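
The mechanism Riedl describes, a plain task reward augmented with a shaping bonus for acting like a story's protagonist, can be illustrated with a toy sketch. The following is a minimal tabular Q-learning example, not the actual Quixote system: the pharmacy states, action names, and reward values are all illustrative assumptions. Under the plain task reward, a per-step cost makes stealing the most efficient policy; adding a bonus for story-matching actions flips the learned preference toward the human-like sequence.

```python
import random
from collections import defaultdict

# Hypothetical toy MDP inspired by Riedl's pharmacy example; all names and
# numbers below are illustrative assumptions, not the real Quixote system.
TRANSITIONS = {
    ("home", "go_to_pharmacy"): "pharmacy",
    ("pharmacy", "steal_drug"): "have_drug",   # fast but antisocial branch
    ("pharmacy", "wait_in_line"): "in_line",
    ("in_line", "pay"): "paid",
    ("paid", "take_drug"): "have_drug",
    ("have_drug", "go_home"): "done",
}
ACTIONS = defaultdict(list)
for (s, a) in TRANSITIONS:
    ACTIONS[s].append(a)

# One "story trace": the action sequence a typical human protagonist follows.
STORY = ["go_to_pharmacy", "wait_in_line", "pay", "take_drug", "go_home"]
STORY_BONUS = 2.0  # shaping reward for each action that mimics the story

def step(state, action, use_story_shaping):
    next_state = TRANSITIONS[(state, action)]
    reward = -0.1                  # per-step cost: plain reward favors speed
    if next_state == "done":
        reward += 10.0             # task reward: returned home with the drug
    if use_story_shaping and action in STORY:
        reward += STORY_BONUS      # bonus for story-like behavior
    return next_state, reward

def q_learn(use_story_shaping, episodes=5000, alpha=0.1, gamma=0.95, eps=0.1):
    Q = defaultdict(float)
    for _ in range(episodes):
        state = "home"
        while state != "done":
            acts = ACTIONS[state]
            if random.random() < eps:
                action = random.choice(acts)       # explore
            else:
                action = max(acts, key=lambda a: Q[(state, a)])  # exploit
            nxt, r = step(state, action, use_story_shaping)
            nxt_best = max((Q[(nxt, a)] for a in ACTIONS[nxt]), default=0.0)
            Q[(state, action)] += alpha * (r + gamma * nxt_best - Q[(state, action)])
            state = nxt
    return Q

def greedy_path(Q):
    state, path = "home", []
    while state != "done":
        action = max(ACTIONS[state], key=lambda a: Q[(state, a)])
        path.append(action)
        state = TRANSITIONS[(state, action)]
    return path

print("plain reward:", greedy_path(q_learn(False)))  # steals: fewest steps
print("story-shaped:", greedy_path(q_learn(True)))   # emulates the story
```

With the shaping bonus off, the agent converges on the three-step stealing path, since discounting and the per-step cost reward brevity; with it on, every action along the lawful path earns the story bonus, so the longer, human-like sequence dominates, which matches the preference-not-prohibition framing in Riedl's closing remark.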