(Mostly I don’t particularly think there should be a common pDoom-ish question, but insofar as I was object-level answering your question here, an answer that feels right-ish to me is “ensure AI x-risk is no higher than around the background risk of nuclear war.”)
Seems somewhat surprising for that to be what Eliezer had in mind, given the many episodes of people saying stuff like “MIRI’s perspective makes sense if we wanted to guarantee that there wasn’t any risk” and Eliezer saying stuff like “no I’d take any solution that gave <=50% doom”. (At least that’s what I remember, though I can’t find sources now.)
I do agree that’s enough evidence to be confused and dissatisfied with my guess. I’m basing my guess more on the phrasing of the question, which sounds more like it’s just meaning to be “what a reasonable person would think ‘prevent extinction’ would mean,” and the fact that Eliezer said that-sort-of-thing in another context doesn’t necessarily mean it’s what he meant here.