I boldly claim that my criticisms are better than those of the Tech Luminaries:
Is an Intelligence Explosion a Disjunctive or Conjunctive Event?
Why an Intelligence Explosion might be a Low-Priority Global Risk (as an addition: No Basic AI Drives and my recent comment here)
Also see:
The Curve of Capability by rwallace
The Betterness Explosion by Robin Hanson
Is The City-ularity Near? by Robin Hanson
A summary of Robin Hanson’s positions by Robin Hanson
How far can AI jump? by Katja Grace
In light of the actual arguments collected on this page so far, I don’t think that’s such a bold claim.
I agree, as of the time of your comment, though I’m adding new arguments as time passes.
Also see a summary of Robin Hanson’s positions here.
Regarding the “also see” material in the parent:
The Curve of Capability makes only a little sense, IMO. The other articles are mostly about a tiny minority breaking away from the rest of civilization. That seems rather unrealistic to me too, but things become more plausible when we consider the possibility of a large coalition, or most of the planet, winning.