While the concise summary clearly associates SI with Good’s intelligence explosion, nowhere does it specifically say anything about Kurzweil or accelerating change. If people really are getting confused about what sort of singularity you’re thinking about, would it be helpful as a temporary measure to put some kind of one-sentence disclaimer in the first couple paragraphs of the summary? I can understand that maybe this would only further the association between “singularity” and Kurzweil’s technology curves, but if you don’t want to lose the word entirely, it might help to at least make clear that the issue is in dispute.
Also, on a separate subject, I notice that the summary presently has a number of “??” marks, presumably a formatting error. Just a heads-up. :)