This anthropic evidence gives you a likelihood function. If you want a probability distribution, you additionally need a prior probability distribution.
Here we assume that the probability of AI creation is distributed uniformly over the interval of AI research, which is obviously false: it should grow toward the end, perhaps exponentially. If we instead assume that the field is doubling, say, every 5 years, Copernican reasoning tells us that if we are randomly selected from the members of this field, the field will end within the next doubling with roughly 50 per cent probability, and within two doublings with 75 per cent probability.
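The arithmetic behind those percentages can be sketched as follows. This is a minimal illustration, assuming that the number of field members so far is some N, that k more doublings would bring the total to N·2^k, and that self-sampling makes your probability of being in the first fraction f of all members equal to f; none of these names or parameters come from the original text.

```python
# Copernican / self-sampling estimate under exponential growth (a sketch).
# Assumption: if the field runs k more doublings, the fraction of all
# members who exist by "now" is 2**-k, so under self-sampling
# P(field ends within k more doublings) = 1 - 2**-k.

DOUBLING_TIME_YEARS = 5  # assumed doubling time of the AI field

def p_end_within(k_doublings: int) -> float:
    """Probability the field ends within k more doublings."""
    return 1 - 2 ** -k_doublings

for k in (1, 2, 3):
    years = k * DOUBLING_TIME_YEARS
    print(f"within {k} doubling(s) (~{years} years): {p_end_within(k):.0%}")
```

With a 5-year doubling time this reproduces the 50 per cent and 75 per cent figures above for one and two doublings respectively.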
TL;DR: anthropics + exponential growth = AGI by 2030.