What would you want to do with the control groups? Teach them that AGI won’t destroy the world? Not teach them anything in particular about AI? Teach them that invading aliens will destroy the world, or that the biblical End Times are near? Any of these would yield useful information. Which one(s) do you favor?
Not teach them anything in particular about AI? Teach them that...the biblical End Times are near?
I was specifically thinking of those exact two conditions, which is why I said "groups": they are different in kind. The aliens example is even better than the supernatural End Times one.
I thought of but rejected “Teach them that AGI won’t destroy the world?” when I couldn’t think of how to implement that neutrally. How would one do that?
I thought of but rejected “Teach them that AGI won’t destroy the world?” when I couldn’t think of how to implement that neutrally.
True. Most arguments against the AGI-apocalypse scenario are responses to arguments for it; it would be difficult to present only one side of the question.