Prediction Thread: Make Predictions About How Different Factors Affect AGI X-Risk.

In this post you can make predictions about how different factors affect the probability that the creation of AGI leads to an extinction-level catastrophe. This might be useful for planning.

Please let me know if you have other ideas for questions that could be valuable to ask.

If AGI is developed before 2100, what is the probability it will cause an extinction-level catastrophe?

Predictions based on who develops AGI:

If AGI is developed on January 1st, 2030, what is the probability it will cause an extinction-level catastrophe?
If AGI is developed on January 1st, 2030, by Google, Microsoft (including OpenAI), or Meta, what is the probability it will cause an extinction-level catastrophe?
If AGI is developed on January 1st, 2030, by the US government/military, what is the probability it will cause an extinction-level catastrophe?
If AGI is developed on January 1st, 2030, by researchers at a university with no close ties to Big Tech or any military, what is the probability it will cause an extinction-level catastrophe?

Predictions based on the technology used to develop AGI:

If AGI is developed on January 1st, 2030, using RLHF similar to today's, with no other breakthrough innovations, what is the probability it will cause an extinction-level catastrophe?
If AGI is developed on January 1st, 2030, using a new ML paradigm different from what we have today, what is the probability it will cause an extinction-level catastrophe?

Predictions based on the approach used to create AGI:

If an AGI is developed on January 1st, 2030, with the sole task of being an oracle, and it acts like an oracle during training, what is the probability it will cause an extinction-level catastrophe?
If an AGI is developed on January 1st, 2030, with the purpose of being a general assistant/worker, what is the probability it will cause an extinction-level catastrophe?

Predictions on how money affects the probability of AGI X-risk:

If someone donated $10 billion today (February 27th) purely based on advice from leading AI alignment researchers, by how much would the risk of an AGI-caused extinction-level catastrophe decrease? (For example, if the risk goes from 50% to 49%, it has decreased by 2%; see the sketch after these questions.)
If someone donated $1 trillion today (February 27th) purely based on advice from leading AI alignment researchers, by how much would the risk of an AGI-caused extinction-level catastrophe decrease?
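
To make the intended measure in the example above explicit, here is a minimal sketch in Python (the function name is mine, chosen just for illustration) of the relative decrease these two questions ask about:

```python
def relative_risk_reduction(risk_before: float, risk_after: float) -> float:
    """Fraction by which the risk decreased, relative to the starting risk."""
    return (risk_before - risk_after) / risk_before

# Example from the question above: a drop from 50% to 49% is a decrease of
# 1 percentage point in absolute terms, but a 2% decrease in relative terms.
print(round(relative_risk_reduction(0.50, 0.49), 4))  # 0.02, i.e. 2%
```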