Once, a smart potential supporter stumbled upon the Singularity Institute’s (old) website and wanted to know if our mission was something to care about. So he sent our concise summary to an AI researcher and asked if we were serious. The AI researcher saw the word ‘Singularity’ and, apparently without reading our concise summary, sent back a critique of Ray Kurzweil’s “accelerating change” technology curves. (Even though SI researchers tend to be Moore’s Law agnostics, and our concise summary says nothing about accelerating change.)
For what it’s worth, my instinct would be to send back a message (if I had the opportunity) saying, “Yes, I agree completely; I don’t believe that Kurzweil’s accelerating change argument has merit. In fact, I believe that most Singularity Institute researchers feel the same way. If you’d like to hear an argument in favor of FAI that does have merit, I’d suggest reading such-and-such.”
That misses the point that SIAI only gets the chance to respond in such a way if the potential supporter actually contacts them and tells them the story. It makes you wonder how many potential supporters they never heard from because the supporter, or someone the supporter asked for advice, dismissed SIAI based on a misunderstanding of what it's about.