You are worried that the SIAI name signals a lack of credibility. You should be worried about what its people do. No, it’s not the usual complaints about Eliezer. I’m talking about Will Newsome, Stephen Omohundro, and Ben Goertzel.
Will Newsome has apparently gone off the deep end:
http://lesswrong.com/lw/ct8/this_post_is_for_sacrificing_my_credibility/6qjg
The typical practice in these cases, as I understand it, is to sweep these people under the rug and forget that they had anything to do with the organization. This might not be the most intellectually honest thing to do, but it’s more PR-minded than leaving them listed, and more polite than adding them to a hall of shame.
And, while the Singularity Institute is announcing that it is absolutely dangerous to build an AGI without proof of friendliness, two of its advisors, Omohundro and Goertzel, are, separately, attempting to build AGIs. Of course, this is only what I have learned from http://singularity.org/advisors/ -- maybe they have since changed their minds?
Goertzel is still there? I’m surprised.
And now there are three: http://singularityhub.com/2013/01/10/exclusive-interview-with-ray-kurzweil-on-future-ai-project-at-google/
Does Kurzweil have anything to do with the Singularity Institute? Because I don’t see him listed as a director or advisor on their site.
He was an adviser. But I see he no longer is. Retracted.