As someone who’s been pinning his hopes on a ‘survivable disaster’ to wake people up to the dangers, this is good news.
I doubt anything capable of destroying the world will come along significantly sooner than superintelligent AGI, and a world in which there are disasters due to AI feels much more likely to survive than one in which the whirling razorblades are invisible.
EDIT: “no fire alarm for AGI.” Oh I beg to differ, Mr. Yudkowsky. I beg to differ.