In the (season) semester of (year), I decided to see how effectively I could communicate the idea of a threat from (noun) to my undergraduate classes. I spent three sessions on this for each of my two classes. My goal was to convince my students that all of us are going to be killed by a(n) (noun). My strategy was to induce the students to come up with the ideas themselves. I gave out a survey before and after. An analysis of the survey responses indicates that the students underwent a statistically significant shift in their reported attitudes. After the three sessions, students reported believing that (noun) would have a larger impact[1] and also a worse impact[2] than they originally reported believing.
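For concreteness, here is a minimal sketch of the kind of pre/post significance test the analysis above implies. The actual survey items, response scale, and statistical test are not specified here, so the Likert-style numbers and the choice of a Wilcoxon signed-rank test below are illustrative assumptions only, not the analysis I actually ran.

```python
# A sketch of a paired pre/post attitude comparison, assuming 1-5
# Likert-scale responses. All numbers are hypothetical placeholders.
from scipy.stats import wilcoxon

# Each index is one student: their response before and after the sessions.
pre  = [2, 3, 2, 1, 3, 2, 4, 2, 3, 2]   # before the three sessions
post = [4, 4, 3, 3, 4, 3, 5, 3, 4, 4]   # after the three sessions

# Wilcoxon signed-rank test: a common choice for paired ordinal data,
# where a paired t-test's normality assumption may not hold.
stat, p = wilcoxon(pre, post)
print(f"W = {stat}, p = {p:.4f}")  # p < 0.05 would count as a significant shift
```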
This is less than surprising. I can’t think of any threat already present in the minds of some undergraduates whose perceived severity a competent, believing professor requiring attendance couldn’t, on average, increase. Control groups are needed.
What would you want to do with the control groups? Teach them that AGI won’t destroy the world? Not teach them anything in particular about AI? Teach them that invading aliens will destroy the world, or that the biblical End Times are near? Any of these would yield useful information. Which one(s) do you favor?
> Not teach them anything in particular about AI? Teach them that...the biblical End Times are near?
I was specifically thinking of those exact two conditions, which is why I said “groups”: they are different in kind. The aliens example is even better than the supernatural End Times one.
I thought of “Teach them that AGI won’t destroy the world” but rejected it when I couldn’t think of how to implement that neutrally. How would one do that?
> I thought of “Teach them that AGI won’t destroy the world” but rejected it when I couldn’t think of how to implement that neutrally.
True. Most arguments against the AGI-apocalypse scenario are responses to arguments for it; it would be difficult to present only one side of the question.