I should clarify. I did not mean to insult Eliezer—I think that he’s a well-intentioned and very brilliant guy. Nor was I attempting to advocate majoritarian epistemology. I also acknowledge that even if Eliezer is misguided in his beliefs about the future, there are clearly other possible explanations besides “that other kind of status.”
To refine my question: When one adopts a view which
(a) Deviates from mainstream beliefs
(b) Is flattering to oneself
(c) Is comprehensive in scope and implications
one should be vigilant about the possibility that one is being influenced by desire for “that other kind of status.”
Eliezer’s views about the expected value of SIAI’s activities seem to meet each of criteria (a), (b), and (c) fairly strongly. This does not mean that his views are wrong, but it does make me reluctant to take them very seriously without evidence that he (and others who hold such views) have exhibited a high level of vigilance about being influenced by a desire for “that other kind of status” in connection with these views.
Is there anywhere I can find evidence that Eliezer and others who share his views have exhibited such a high level of vigilance toward the possibility of being influenced by a desire for “that other kind of status” in connection with their views about the expected value of SIAI’s activities?
[Edited for formatting.]
Eliezer has said that “it seems pretty obvious to me that [at] some point in the not-too-distant future we’re going to build an AI [...] it will be a superintelligence relative to us [...] in one to ten decades and probably on the lower side of that” (http://bloggingheads.tv/diavlogs/21857).
The vast majority of very smart and accomplished people (e.g., Nobel laureates in the sciences, Fields medalists, founders of large tech corporations) do not subscribe to the view that the “singularity is near.” This raises a strong possibility that people like Eliezer, who think it’s pretty obvious that “the singularity is near,” are deluded for the same reason that the 9/11 Truthers are. As Yvain says, it’s a boost to one’s self-esteem to feel that one has “figured out a deep and important secret that the rest of the world is too complacent to realize.”
Has there been any discussion of this matter in the Less Wrong archives?