I like Yudkowsky’s toy example of tasking an AGI to copy a single strawberry, on a molecular level, without destroying the world as a side-effect.
I also like it for this reason, though I personally think that a lot of the challenge lies in being capable enough to do it at all, rather than in keeping it from destroying the world as a side-effect.
Still, I kind of like the toy example.