[Question] Poll: Which variables are most strategically relevant?

Which variables are most important for predicting and influencing how AI goes?

Here are some examples:

  • Timelines: “When will crazy AI stuff start to happen?”

  • Alignment tax: “How much more difficult will it be to create an aligned AI vs. an unaligned AI, when it becomes possible to create powerful AI?”

  • Homogeneity: “Will transformative AI systems be trained/created all in the same way?”

  • Unipolar / Multipolar: “Will transformative AI systems be controlled by one organization or many?”

  • Takeoff speeds: “Will takeoff be fast or slow (or hard or soft, etc.)?”

We made this question to crowdsource more entries for our list, along with operationalizations and judgments of relative importance. This is the first step of a larger project.

Instructions:

  1. Answers should be variables that are importantly different from the previous answers. It’s OK if there’s some overlap or correlation. If your variable is too similar to a previous answer, instead of making a new answer, comment on the previous answer with your preferred version. We leave it to your judgment to decide how similar is too similar.

  2. Good operationalizations are important. If you can give one or more as part of your answer, great! If you can’t, don’t let that stop you from answering anyway. If you have a good operationalization for someone’s variable, add your operationalization as a comment to that variable.

  3. Upvote variables that you think are important, and strong-upvote variables that you think are very important. You can also downvote variables that you think are unimportant or overrated.

  4. The relevant sense of importance is importance for predicting and influencing how AI goes. For example, “Will AIs in the long-term future be aligned?” is important in some sense, but not that helpful to think about, so shouldn’t score highly here.