There is social pressure to hide X. So, X turns out to be much more common and much less extreme than one naively imagined. The net effect is already out there, so maybe just chill.
But the above story exists in equilibrium with the reverse reaction:
There is social pressure to hide X. It turns out that X is much more common than one naively imagined, and although the average instance of X is not so extreme, the system is actually about to collapse under the cumulative weight of X, and almost nobody is aware until it happens.
A story like this is revealed at the end of every business cycle, where X is some form of corruption we previously thought was held in check by the social pressure against X (pressure which turned out to be insufficiently harsh). Like, for example, approving loans to people we know are unlikely to pay them back.
Toby Ord and I wrote a paper that describes how “expanding cosmological civilizations” (my less-snappy but more descriptive term for “grabby civilizations”) update our estimates of any late filters you might want to consider—assuming SIA as anthropic school: https://arxiv.org/abs/2106.13348
Basically, suppose you have some prior pdf P(q) on the probability "q" that we pass any late filter. Then considering expanding civilizations tells you to update it to P(q) --> P(q)/q (renormalized so it still integrates to 1). And this isn't good news, since it upweights low values of "q" (i.e. lower survival probability).
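To see the effect concretely, here's a minimal numerical sketch of that update. The uniform prior and the grid over (0, 1] are my illustrative assumptions, not anything from the paper; the point is just that reweighting by 1/q pushes probability mass toward low survival probabilities.

```python
import numpy as np

# Grid of candidate late-filter survival probabilities q in (0, 1].
q = np.linspace(0.01, 1.0, 1000)

# Illustrative prior: uniform over the grid (an assumption for this sketch).
prior = np.ones_like(q)
prior /= prior.sum()          # normalize to a discrete distribution

# The update P(q) -> P(q)/q, then renormalize.
posterior = prior / q
posterior /= posterior.sum()

prior_mean = (q * prior).sum()
post_mean = (q * posterior).sum()

# The update shifts mass toward low q, so the expected survival
# probability drops.
assert post_mean < prior_mean
```

On a uniform prior over (0.01, 1], the expected value of q falls from roughly 0.5 to roughly 0.2 under this update, which is the quantitative sense in which "this isn't good."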
A general argument analogous to this was actually advanced by Katja Grace, long before we started studying expanding cosmological civilizations—just in the context of regular galactic SETI. But the geometry of ECCs gives it a surprisingly simple, quantifiable form.