I have a hard time imagining a filter that could've wiped out every one of a large number of civilizations that reached our current stage or beyond. That's not to say future x-risks aren't a concern; it just seems implausible that no civilization would've managed to coordinate around them or avoid developing the dangerous tech in the first place. Not every civilization need share our coordination failures: bonobos, for instance, seem substantially more altruistic than humans and are among the most intelligent non-human species, so a civilization descended from a bonobo-like species might coordinate far better than we do.
Also, I thought of an interesting response to the Great Filter, assuming we become pretty sure it actually lies ahead of us: halt all technological development ASAP and stay on this planet, chilling out. It's possible that other civilizations have already done this, having realized that the Great Filter was an issue and that there was likely deadly tech in their future. If they had, we wouldn't know about it.