SIAI/FAI is explicitly NOT what LessWrong is about; more senior members often refer to it with phrases like "the organization that must not be named."
Why I think this is not the case:
Eliezer Yudkowsky wrote: “I mean, it seems to me that where I think an LW post is important and interesting in proportion to how much it helps construct a Friendly AI, how much it gets people to participate in the human project...”
The Sequences were written with the goal of convincing people to take risks from AI seriously and therefore donate to the SIAI: "…after a few years of beating my head against the wall trying to get other people involved, I realized that I really did have to go back to the beginning, start over, and explain all the basics that people needed to know before they could follow the advanced arguments. Saving the world via AI research simply can't compete against the Society for Treating Rare Diseases in Cute Kittens unless your audience knows about things like scope insensitivity…" (Reference: An interview with Eliezer Yudkowsky).
LessWrong is used to ask for donations.
LessWrong's header carries the SIAI's logo and a link to its site, and the SIAI's front page carries LessWrong's logo and a link back.
LessWrong is mentioned as an achievement of the SIAI (Quote: “Less Wrong is important to the Singularity Institute’s work towards a beneficial Singularity”).
A quote from the official SIAI homepage: “Less Wrong is [...] a key venue for SIAI recruitment”.
LessWrong is the mouthpiece of the SIAI and its main advertising platform. I don't think one can reasonably disagree about that.
Thanks for helping me change my mind.
Thanks for changing your mind. You should edit your original comment so it doesn't confuse people who are skimming.
Done.