Does mankind have a duty to warn extraterrestrial civilizations that we might someday unintentionally build an unfriendly super-intelligent AI that expands at the speed of light gobbling up everything in its path?
One response to such a warning would be to build a super-intelligent AI that expands at the speed of light gobbling up everything in its path first.
And when the two (or more) collide, it would make a nice SF story :-)
This wouldn’t be a horrible outcome, because the two civilizations’ light cones would never fully intersect. Neither civilization would fully destroy the other.
The light cones might not fully intersect, but humans do not expand at anywhere near the speed of light. It would be enough for the rival AI to destroy the populated planets.
Are you crazy?! Think of all the potential paperclips that wouldn’t come into being!