So the Great Filter (preventing the emergence of star-spanning civilizations) must strike before AI could be developed.
Maybe AI is the Great Filter, even if it is friendly.
The friendly AI could determine that colonization of what we define as “our universe” is unnecessary or detrimental to our goals. Seems unlikely, but I wouldn’t rule it out.
This is necessarily very personal, but I have a hard time seeing how settlement of the cosmos does not follow from things I hold as terminal, such as building diversity of conscious experience.
Not colonising the universe—many moralities could go with that.
Allowing potential rivals to colonise the universe… that’s much rarer.