It’s a very good example. It also illustrates how hard it is to specify a useful utility function for an AGI: “get me to my destination and don’t kill anyone or cause any damage on the way” can lead to a number of non-obvious unintended consequences, compared to the CEV version: “drive as a human would drive if that human were faster-thinking, calmer, clearer-minded, and more focused; had sharper eyes, better knowledge of the roads and hazards, and a better ability to cooperate with other drivers”.