What’s up with the word “foom”, and why is it always in all caps? Can we come up with another name for this that doesn’t sound like a sci-fi nerd in need of Ritalin?
Yeah, I agree. “Intelligence explosion” is bandied about, but I guess that can also refer to Kurzweilian-style exponential growth phenomena.
“Hard take-off singularity” is close, too, but not exactly the same. Again, it refers to a certain magnitude of acceleration, whereas FOOM refers specifically to recursive self-improvement as the mechanism.
I’m open to suggestions.
My $0.02: singularities brought about by recursive self-improvement are one concept, and singularities involving really-really-fast improvement are a different concept. (They are, of course, perfectly compatible.)
It may just not be all that useful to have a single word that denotes both.
If I want to talk about a “hard take-off” or a “step-function” scenario caused by recursively self-improving intelligence, I can say that.
But I estimate that 90% of what I will want to say about it will be true of many different step-function scenarios (e.g., those caused by the discovery of a cache of Ancient technology) or true of many different recursively self-improving intelligence scenarios.
So it may be worthwhile to actually have to stop and think about whether I want to include both clauses.
Completely agree with paras 1 and 2.
However, it does seem that we talk about a “hard take-off scenario caused by recursively self-improving intelligence” often enough to warrant a convenience term meaning just that. Much of the discussion about cascades, cycles, insights, AI-boxes, resource overhangs, etc. is specific to the recursive self-improvement scenario, and not to, e.g., the cache of Ancient tech scenario.
See http://lesswrong.com/lw/we/recursive_selfimprovement/ for an attempt at a definition.