Separately, the absence of a singularity plausibly implies we lose most of the value in the universe from the longtermist perspective.
I don’t really see why that would be the case. We could still go out and settle space, end disease, etc. It would just mean that starkly superintelligent machines turn out not to be alien minds by virtue of their superintelligence.
(Sorry, I edited my comment because it was originally very unclear/misleading/wrong. Does the edited version make more sense?)