Short versions of the basic premise about FAI
I’ve been using something like: “A self-optimizing AI would be so powerful that it would simply roll over the human race unless it were programmed not to.”
Any others?