give the SIAI staff an opportunity to respond to the points which I raise in the present post as well as my two posts titled Existential Risk and Public Relations and Other Existential Risks.
Indeed, given how busy everyone at SIAI has been with the Summit and the academic workshop following it, it is not surprising that there has not been much response from SIAI. I was only involved as an attendee of the Summit, and even I am only now able to find time to sit down and write something in response. At any rate, as a donor and former visiting fellow, I am only loosely affiliated with SIAI, and my comments here are solely my own, although my thoughts are certainly influenced by observations of the organization and conversation with those at SIAI. I don’t have the time/knowledge to address everything in your posts, but I wanted to say a couple of things.
I don’t disagree with you that SIAI has certain public relations problems. (Frankly, I doubt anyone at SIAI would disagree with that.) There is a lot of attention and discussion at SIAI about how to best spread knowledge about existential risks and to avoid sounding like a fringe/doomsday organization in doing so. It’s true that SIAI does consider the development of a general artificial intelligence to be the most serious existential risk facing humanity. But at least from what I have seen, much of SIAI’s current approach is to seed awareness of various existential risks among audiences that are in a position to effectively join the work in decreasing that risk.
Unfortunately, gaining recognition of existential risk is a hugely difficult task. Recent books from leading intellectuals on these issues (Sir Martin Rees’s Our Final Hour and Judge Richard Posner’s Catastrophe) don’t seem to have had very much apparent impact, and their ability to influence the general public is much greater than SIAI’s. But through the Summit and various publications, awareness does seem to be gradually increasing, including among important academics like David Chalmers.
Finally, I wanted to address one particular public relations issue that is evident from your criticism so far: the (understandable) perception among many observers that SIAI and Eliezer are essentially synonymous. In the past, this perception may have been largely accurate. I don’t think it holds true any longer, but it persists in many people’s minds.
Given this perception, your primary focus on Eliezer to the exclusion of the other work that SIAI does is understandable. Nor, of course, could anyone deny that Eliezer is an important part of SIAI, as its founder, board member, and prominent researcher. But there are other SIAI officers, board members, researchers, and volunteers, and there is other work that SIAI is trying to do. The Summit is probably the most notable example of this. SIAI-affiliated people are also working on spreading knowledge of existential risks, and the need to face them, in academia and more broadly. The evolution of SIAI into an organization not focused solely on EY and his research is still a work in progress, and the rebranding of the organization as such in the public’s mind has not necessarily kept pace with even that gradual progress.
As for EY having delusions of grandeur, I want to address that only briefly, because EY is obviously in a much better position to address it himself if he chooses to. My understanding of the video you linked to in your previous post is that EY is commenting on both 1) his ability to work on FAI research and 2) his desire to work on that research. However high EY’s opinion of his own ability (and it is doubtless very high), I have seen comments from him recognizing that there are others with equally high or even higher ability, e.g., The Level Above Mine. I have no doubt EY would agree that the pool of those with the requisite ability is very limited. But the even greater obstacle to someone carrying on EY’s work is the combination of that rare ability with the also-rare desire to do that research and make it one’s life’s work. I think that is why EY answered the way he did. Indeed, the reference to Michael Vassar makes sense primarily along the desire axis, since Michael Vassar’s expertise is not in developing FAI himself, although he has other great qualities for SIAI’s current mission of spreading existential risk awareness, etc.
I don’t disagree with you that SIAI has certain public relations problems.
Speaking from personal experience, the SIAI’s somewhat haphazard response to people answering its outreach calls strikes me as a bigger PR problem than Eliezer’s personality. The SIAI strikes me as in general not very good at effective collective action (possibly because that’s an area where Eliezer’s strengths are, as he admits himself, underdeveloped). One thing I’d suggest to correct that is to massively encourage collaborative posts on LW.
Agreed. I think that communication and coordination with many allies and supporters has historically been a weak point for SIAI, due to various reasons including overcommitment of some of those tasked with communications, failure to task anyone with developing or maintaining certain new and ongoing relationships, interpersonal skills being among the less developed skill sets among those at SIAI, and the general growing pains of the organization. My impression is that there has been some improvement in this area recently, but there’s still room for a lot more.
More collaborative posts on LW would be great to see. There have also been various discussions about workshops or review procedures for top-level posts that seem to have generated at least some interest. Maybe those discussions should just continue in the open thread or maybe it would be appropriate to have a top-level post where people could be invited to volunteer or could find others interested in collaboration, workshops, or the like.
Thanks for pointing out “The Level Above Mine.” I had not seen it before.