Caught in the glare of two anthropic shadows
This article consists of original research, so it would not get published on Wikipedia!
The previous post introduced the concept of the anthropic shadow: the fact that certain large and devastating disasters cannot be observed in the historical record, because if they had happened, we wouldn’t be around to observe them. This absence forms an “anthropic shadow”.
But that was the result for a single category of disasters. What would happen if we consider two independent classes of disasters? Would we see a double shadow, or would one ‘overshadow’ the other?
To answer that question, we’re going to have to analyse the anthropic shadow in more detail, and see that there are two separate components to it:
The first is the standard effect: humanity cannot have developed a technological civilization, if there were large catastrophes in the recent past.
The second effect is the lineage effect: humanity cannot have developed a technological civilization, if there was another technological civilization in the recent past that survived to today (or at least, we couldn’t have developed the way we did).
To illustrate the difference between the two, consider the following model. Segment time into arbitrary “eras”. In a given era, a large disaster may hit with probability q, and a small disaster may independently hit with probability q (hence with probability q², there will be both a large and a small disaster). A small disaster will prevent a technological civilization from developing during that era; a large one will prevent such a civilization from developing in that era or the next one.
If it is possible for a technological civilization to develop (no small disaster that era, no large one in the preceding era, and no previous civilization), then one will do so with probability p. We will assume p is constant: our model will only span a time frame where p is unchanging (maybe the period after the rise of big mammals?)
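As a sanity check on this setup, the model can be simulated directly. A minimal sketch (the function and its parameters are my own illustration, not from the original analysis):

```python
import random

def simulate_history(q, p, n_eras, seed=0):
    """Simulate one history of eras; return (eras, index of first civilization or None).

    Each era independently gets a large disaster with probability q and a small
    one with probability q. A civilization can arise (with probability p) only if
    there is no disaster this era, no large disaster in the previous era, and no
    earlier civilization (the lineage effect)."""
    rng = random.Random(seed)
    eras = []          # list of (large, small) booleans
    civ_era = None
    for t in range(n_eras):
        large = rng.random() < q
        small = rng.random() < q
        eras.append((large, small))
        prev_large = eras[t - 1][0] if t > 0 else False
        if civ_era is None and not (large or small or prev_large):
            if rng.random() < p:
                civ_era = t
    return eras, civ_era
```

By construction, any era that spawns a civilization is disaster-free, and the era before it never contains a large disaster: exactly the standard anthropic effect described above.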
Assume a technological civilization develops during a given era (in which, of course, there are no disasters). Let _ denote no disaster, ▄ denote a small disaster only, and █ denote a large disaster (with or without a small disaster as well). Then the possible past sequences that end in the current era (which is a _ by definition) can be divided into sequences that end in the following ways (the anthropic shadow is the first row):
| Sequence | Probability | Anthropic-corrected probability |
|:---|:---|:---|
| █ ▄ _ | q²(1-q) | q²(1-q)/Ω |
| █ _ | q | 0 |
| ▄ ▄ _ or _ ▄ _ | q(1-q)² | q(1-q)²/Ω |
| █ _ … _ or ▄ _ … _ | (1-q)² | (1-(1-q)²)·((1-q)²(1-p)/(1-(1-q)²(1-p)))/Ω |
The second column gives the probabilities of these various pasts, without any anthropic correction. The third column applies the anthropic correction, essentially zeroing out some of the probabilities and then renormalising by Ω, which is the sum of the remaining probabilities, namely q²(1-q) + q(1-q)² + (1-(1-q)²)·((1-q)²(1-p)/(1-(1-q)²(1-p))). Don’t be impressed by the complexity of the formula: the ideas are key.
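These row probabilities are easy to check numerically. A minimal sketch (the row labels and function name are mine), which also confirms that the uncorrected rows sum to 1:

```python
def table_probabilities(q, p):
    """Return (uncorrected, corrected) probability lists for the four rows."""
    shadow     = q * q * (1 - q)     # row 1: █ ▄ _  (the anthropic shadow)
    large_prev = q                   # row 2: █ _    (ruled out anthropically)
    small_prev = q * (1 - q) ** 2    # row 3: ▄ ▄ _ or _ ▄ _
    free_prev  = (1 - q) ** 2        # row 4: █ _ … _ or ▄ _ … _
    uncorrected = [shadow, large_prev, small_prev, free_prev]
    # Lineage correction for row 4: a run of k >= 1 disaster-free eras before
    # the current one, each failing to spawn a civilization (a (1-p) factor
    # per era), summed as a geometric series.
    x = (1 - q) ** 2 * (1 - p)
    free_prev_corr = (1 - (1 - q) ** 2) * x / (1 - x)
    omega = shadow + small_prev + free_prev_corr
    corrected = [shadow / omega, 0.0, small_prev / omega, free_prev_corr / omega]
    return uncorrected, corrected
```

Both corrections push probability towards the first row: with q = 0.3 and p = 0.5, for instance, the shadow row more than doubles in probability (from 0.063 to roughly 0.17).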
The standard anthropic effect rules out the second row: we can’t have large disasters in the previous era. The lineage effect reduces the probability of the fourth row: we have less chance of developing a technological civilization in this era, if there are many opportunities for a previous one to develop. Both these effects increase the relative probability of the first row, which is the anthropic shadow.
The first thing to note is that the standard effect is very strong for high q. If q is very close to 1, then the third and fourth rows, being multiples of (1-q)², are much less likely than the first row, which carries only a single power of (1-q). Hence an anthropic shadow is nearly certain.
The lineage effect is weaker. Even if p=1, the only effect is to rule out the fourth row. The first and third rows remain as possibilities, in the ratio q²(1-q):q(1-q)², i.e. q:(1-q): hence we still need a reasonably large q to get an anthropic shadow. If we break the scale of the disaster down into more than two levels, and insist on a strict anthropic shadow (steadily diminishing disaster intensity), then the lineage effect becomes very weak indeed. If the data is poor, though, or if we allow approximate shadows (if, for instance, we count the third row as an anthropic shadow: civilization appeared as soon after the small disaster as it could), then the lineage effect can be significant.
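The p=1 case can be made concrete: with the fourth row gone, the shadow’s normalized probability works out to exactly q, so a shadow is only likely when large disasters are themselves likely. A self-contained sketch (the naming is mine):

```python
def shadow_prob_full_lineage(q):
    """Normalized probability of the shadow row █ ▄ _ when p = 1: rows 2 and 4
    are ruled out, leaving the ratio q²(1-q) : q(1-q)², i.e. q : (1-q)."""
    shadow = q * q * (1 - q)          # █ ▄ _
    small_prev = q * (1 - q) ** 2     # ▄ ▄ _ or _ ▄ _
    return shadow / (shadow + small_prev)
```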
What has this got to do with multiple disasters? If we look at meteor impacts and (non-meteor-caused) supervolcanoes, what should we expect to see? The simple rule is that the standard anthropic shadows of two classes of disasters combine (we see two anthropic shadows), while the lineage effect of the more frequent class of disasters dominates.
How can we see this?
Let’s ignore the lineage effect for the moment, and let q and r be the probabilities for the two classes of disasters. Then instead of four situations, as above, we have sixteen: four for each class of disaster, independently. The joint table is the product of two copies of the table above: its rows are the four sequences for the first class (with probabilities in terms of q), its columns are the four sequences for the second class (with probabilities in terms of r), and each cell’s probability is the product of its row and column probabilities.
Applying the standard anthropic effects then means removing the second row, then removing the second column (or vice versa), and renormalising. The anthropic effect for the first class of disasters moves the probability of its anthropic shadow from q²(1-q) to q², while the joint probability of the combined anthropic shadow (the top-left cell of the joint table) moves from q²(1-q)·r²(1-r) to q²·r². The two shadows are independent.
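The multiplicative combination can be verified directly. A small sketch (ignoring the lineage effect, as above; the function name is my own):

```python
def joint_shadow_probability(q, r):
    """Build the 4x4 joint table as the outer product of the two marginal
    tables, zero out the anthropically impossible row and column, renormalise,
    and return the probability of the combined shadow (the top-left cell)."""
    def marginal(s):
        return [s * s * (1 - s),    # █ ▄ _  (shadow)
                s,                  # █ _    (ruled out anthropically)
                s * (1 - s) ** 2,   # ▄ ▄ _ or _ ▄ _
                (1 - s) ** 2]       # _ … _
    rows, cols = marginal(q), marginal(r)
    table = [[a * b for b in cols] for a in rows]
    table[1] = [0.0] * 4            # standard effect for the first class
    for row in table:
        row[1] = 0.0                # standard effect for the second class
    total = sum(map(sum, table))
    return table[0][0] / total
```

The result equals q²·r², the product of the two individually corrected shadow probabilities: the shadows combine independently.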
In contrast, the lineage effect rules out long series of disaster-free eras before our own. If a long series is possible (q and r both being low), then the lineage effect makes a big impact. But the likely number of disaster-free eras is determined by the higher of q and r. If q and r are both 1%, then we’d expect something like 25 disaster-free eras in a row. If q is 10% and r is 1%, then we’d expect something like 5 disaster-free eras in a row: the same as if r were zero, and we faced only a single class of disasters.
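Those run-length figures can be reproduced with a rough calculation, under the simplifying assumption that an era is disaster-free with probability (1-q)²(1-r)² and that the wait until the first disaster is geometric:

```python
def expected_eras_to_disaster(q, r):
    """Expected number of eras until some disaster (of either class) hits."""
    free = (1 - q) ** 2 * (1 - r) ** 2   # no large/small disaster of either class
    return 1 / (1 - free)                # mean of a geometric waiting time
```

This gives roughly 25 eras for q = r = 1%, and roughly 5 for q = 10%, r = 1%: nearly unchanged from the r = 0 case, as claimed.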
So although the real-world data is probably too poor to conclude anything, the above result raises the possibility that we could estimate the lineage effect independently, by looking at the anthropic shadows of unrelated disasters, and thus get another way of calculating the probability of technological civilizations such as our own emerging.