If we forget a dimension, like “AGI, please remember we don’t like getting bored”, then things go badly, even if we also added some other fake dimension that has nothing to do with boredom.
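A toy sketch of the forgotten-dimension problem (everything here is made up for illustration, not from the post): if the reward the AI optimizes omits the "novelty" dimension, the optimizer happily picks the comfy-but-boring outcome.

```python
# Each candidate world-state is scored on two dimensions humans care about.
# All state names and numbers are invented for this illustration.
states = {
    "cozy_pod":   {"comfort": 10, "novelty": 0},   # maximally comfy, maximally boring
    "adventure":  {"comfort": 6,  "novelty": 8},
    "status_quo": {"comfort": 5,  "novelty": 5},
}

def true_utility(s):
    # What humans actually want: comfort AND not being bored.
    return s["comfort"] + s["novelty"]

def proxy_reward(s):
    # What we accidentally trained the AI on: novelty got forgotten.
    return s["comfort"]

ai_choice = max(states, key=lambda name: proxy_reward(states[name]))
human_choice = max(states, key=lambda name: true_utility(states[name]))

print(ai_choice)     # the AI picks the comfy-but-boring state
print(human_choice)  # humans would have preferred the adventure
```

Adding a second proxy dimension that is also unrelated to boredom (say, "shininess") wouldn't change the outcome: the optimizer still never pays a cost for boring states.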
If we train the AI on data from our current world, then it will [almost?] certainly see new things when it runs for real. As a toy example (not realistic, but I think correct): the AI will give everyone a personal airplane, and then it will have to deal with a world that has lots of airplanes.
Seems like two separate things (?)
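The second point, the AI pushing the world outside its own training distribution, can be sketched numerically (numbers invented for illustration): the "airplanes per person" feature in the deployed world is far beyond anything in the training data.

```python
# Toy illustration (made-up numbers): the AI's own actions shift the
# world outside the distribution it was trained on.
import statistics

# Training data: personal airplanes per 1000 people in today's world.
train_airplanes = [0.0, 0.1, 0.0, 0.2, 0.1]
train_mean = statistics.mean(train_airplanes)
train_max = max(train_airplanes)

# After deployment, the AI hands everyone a personal airplane.
deployed_airplanes = 1000.0  # per 1000 people

# The post-deployment world lies far outside the training support.
out_of_distribution = deployed_airplanes > train_max
print(out_of_distribution)  # True
```

The point is only that the shift is self-caused: no amount of extra training data from the *current* world covers the world the AI itself creates.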