Honestly, I don’t think Eliezer would look overly eccentric if it weren’t for LessWrong/Overcomingbias. Computer science is notoriously eccentric, and AI research possibly more so. The stigma against Eliezer doesn’t come from his ideas or his self-confidence; it comes from his following.
Kurzweil is a more muted case: he has good ideas but is clearly sensationalist, and he has a large following, though one not nearly as dedicated as Eliezer’s (dedicated not necessarily to Eliezer himself, but to his writings and to the “practice of rationality”). And the effect? I feel a visceral distaste whenever I hear someone from the Kurzweil camp say something pro-singularity. It’s easy for me to imagine that, if I didn’t already put stock in the notion of a singularity, hearing a Kurzweilian talk would bias me against the idea.
Nonetheless, it may well be that Kurzweil has done net good for the singularity meme (and perhaps net harm to the existential-risk meme), spreading the idea far and wide even while generating negative responses. Is the case with Eliezer the same? I don’t know. My gut says no. Taking existential risk seriously is a much harder meme to catch than believing in a dumbed-down version of the singularity.
My intuition is that Eliezer by himself, although abrasive in presentation, isn’t turning people off with his self-confidence and grandiosity. On the contrary, I love to argue with intelligent people who hold strong beliefs, and I suspect many others do too. In this sense, Eliezer’s self-assurance is good bait. On the other hand, when someone with inferior debating skills goes around spouting someone else’s message, that, to me, is purely repulsive: I have no desire to talk with such people. They’re the ones spouting aether nonsense on physics forums. There’s no status to be won, even on the slim chance of victory.
Finally, aside from Eliezer as himself and Eliezer through the proxy of others, there’s also Eliezer as a figurehead of SIAI. Here things are different again, and Eliezer is no longer merely himself. He speaks for an organisation, and, culturally, we expect serious organisations to temper their outlandish claims. Take cancer research: presumably all researchers want to cure cancer, and presumably at least some of them are optimistic and believe we actually will. But we rarely hear this from individuals, and we never hear it from organisations.
I think SIAI, and Eliezer in his capacity as a figurehead, should probably temper their claims as well. The idea of existential risk from AI is already pervasive; Hollywood took care of that. What remains is a battle for credibility.
(Unfortunately, I really don’t know how to go about tempering claims when the previous claims are already on permanent record. But maybe this isn’t as important as I think it is.)
Honestly, I don’t think Eliezer would look overly eccentric if it weren’t for LessWrong/Overcomingbias. Computer science is notoriously eccentric, and AI research possibly more so. The stigma against Eliezer doesn’t come from his ideas or his self-confidence; it comes from his following.
Would you include SL4 there too? I think there were discussions there years ago (well before OB, and possibly before Kurzweil’s overloaded Singularity meme complex became popular) about the perception of SIAI/Singularitarianism as a cult. (I wasn’t around for any such discussions, but I’ve poked around in the archives from time to time. Here is one example.)