This page collects links to criticism of what appears to be the prevailing view on Less Wrong regarding the possibility of hard takeoff and the importance of Friendly AI (FAI).
This thread has a more up-to-date list of critiques.
Criticism
Thoughts on the Singularity Institute (SI) (now MIRI) from Holden Karnofsky. Responses: 1, 2, 3, 4
The Singularity Institute’s Scary Idea (and Why I Don’t Buy It). The author names Nick Bostrom as another skeptic regarding the “scary idea”; a question-and-answer session in which Nick addresses these topics can be seen here. (Tip for watching video lectures: you may wish to download the video using a browser extension and watch it sped up in VLC.)
SIA says AI is no big threat. You can also read everything the author has written about AI.
Alexander Kruel’s writing on MIRI and Less Wrong is generally critical.
Three arguments against the singularity. Reaction from Robin Hanson.
Is an Intelligence Explosion a Disjunctive or Conjunctive Event?
Why an Intelligence Explosion might be a Low-Priority Global Risk.
Why We Can’t Take Expected Value Estimates Literally (Even When They’re Unbiased). Responses: 1, 2. (You may also wish to view the author’s Less Wrong user page, as his criticisms of SI feature heavily there.)
Debate
The Hanson-Yudkowsky AI-Foom Debate, a text debate that occurred in late 2008.
2011 Hanson-Yudkowsky live AI Foom debate at Jane Street Capital. Hanson’s summary of the debate and his arguments.
John Baez Interviews Eliezer Yudkowsky. (Note: You’ll have to scroll to the bottom to read the first part.) Although John finds Eliezer’s claims of AI dangers plausible, he isn’t persuaded to give up environmentalism in favor of working on FAI.
Bloggingheads: Yudkowsky and Aaronson talk about AI and Many-worlds. Other Bloggingheads discussions that have appeared on Less Wrong.
GiveWell.org interviews SIAI. (Some Singularity Institute folks say they were poorly represented in this interview.)
Several Arguments For and Against Superintelligence/”Singularity”.
Talk:Perspectives on intelligence explosion
I nominate this page for deletion. Reasons: does not (even pretend to) follow the style of other articles, is not an LW-centric concept or one introduced in blog posts, has no articles linking to it and does not link to other articles (I say this knowing the risk that it might happen, which would be bad), bizarre inclusionism of the dumbest of critiques. Grognor 20:42, 14 March 2012 (UTC)
Given that the page seems to have received some positive reception, I think it’s worth improving the page instead of deleting it. I’m willing to give up inclusionism to a certain extent. I realize that some of the “critiques” may not deserve to be there at all, but I think it’s worth linking to some critiques that are not quite up to the Less Wrong standard. I especially think we should link to the Charlie Stross and Ben Goertzel ones since they have been so widely read. I’m also open to renaming the page.
I’m not convinced of the advantages of limiting the scope of the wiki in general. For example, would it really do that much harm to have a wiki page linking to all of the advice threads for young people on what major to choose and so on that have popped up over the years? I’d rather expand the scope of the wiki so that this sort of page is normal than delete this page so the scope stays constricted. Maybe we could engage the community at large with this issue?--John Maxwell IV 22:12, 14 March 2012 (UTC)
Retracting deletion nomination. I’d prefer the page to be kept, but not linked to until/unless it is improved significantly. - Grognor 20:13, 17 May 2012 (UTC)