Humanity grows saner, SIAI is well funded, and successfully develops a FAI. Just as the AI finishes calculating the CEV of humanity, a stray cosmic ray flips the sign bit in its utility function. It proceeds to implement the anti-CEV of humanity for the lifetime of the universe.
(Personally, I think contrivedness only detracts from the raw emotional impact that such scenarios have if you ignore their probability and focus on the outcome.)
I actually use “the sign bit in the utility function” as one of my canonical examples for how not to design an AI.
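To make the fragility concrete, here is a minimal sketch (not from the original comments) of a toy agent whose utility is stored as a single IEEE-754 double: flipping one bit turns the maximizer into a minimizer. The action names and utility values are purely illustrative.

```python
import struct

def flip_sign_bit(x: float) -> float:
    """Flip the IEEE-754 sign bit of a double, as a single-bit corruption would."""
    (bits,) = struct.unpack("<Q", struct.pack("<d", x))
    return struct.unpack("<d", struct.pack("<Q", bits ^ (1 << 63)))[0]

# Toy agent: picks the action whose (possibly corrupted) utility is highest.
utilities = {"implement CEV": 1.0, "do nothing": 0.0, "implement anti-CEV": -1.0}

def best_action(corrupt=lambda u: u):
    return max(utilities, key=lambda a: corrupt(utilities[a]))

print(best_action())               # 'implement CEV'
print(best_action(flip_sign_bit))  # 'implement anti-CEV'
```

The design point, as I read it, is that nothing about the agent should hinge on a single unprotected bit: a utility representation with redundancy or error-checking would at least fail loudly rather than silently inverting the goal.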