I was the one who suggested the board. I originally called it “rationality/decision theory/existential risk/FAI/etc.”, but it was remarked that this was too long. I agree LessWrong is not the best name for it. SIAI/FAI is explicitly NOT what LessWrong is about and is often referred to as “the organization that must not be named” and things like that by more senior members.
Redacted because I was stupid and wrong!
Original text:
Why I think this is not the case:
Eliezer Yudkowsky wrote: “I mean, it seems to me that where I think an LW post is important and interesting in proportion to how much it helps construct a Friendly AI, how much it gets people to participate in the human project...”
The Sequences were written with the goal of convincing people to take risks from AI seriously and therefore to donate to the SIAI: “...after a few years of beating my head against the wall trying to get other people involved, I realized that I really did have to go back to the beginning, start over, and explain all the basics that people needed to know before they could follow the advanced arguments. Saving the world via AI research simply can’t compete against the Society for Treating Rare Diseases in Cute Kittens unless your audience knows about things like scope insensitivity...” (Reference: An interview with Eliezer Yudkowsky).
LessWrong is used to ask for donations.
You can find the SIAI’s logo, with a link to its site, in the header here, and a LessWrong logo and link on the SIAI’s front page.
LessWrong is mentioned as an achievement of the SIAI (Quote: “Less Wrong is important to the Singularity Institute’s work towards a beneficial Singularity”).
A quote from the official SIAI homepage: “Less Wrong is [...] a key venue for SIAI recruitment”.
LessWrong is the mouthpiece of the SIAI and its main advertisement platform. I don’t think one can reasonably disagree about that.
Thanks for helping me change my mind.
Thanks for changing your mind. You should edit your original comment to not confuse people skimming.
Done.
For the benefit of any drive-by future readers wishing to pick up the cultural norms here: the above is simply false.
I am not sure about this—do you have sources? Especially since the SIAI logo is on every page (top right corner).
This was done to keep the sequences on the topic of rationality rather than AI theory/singularitarianism. It isn’t really used anymore.
Ah. And my intuitions about LessWrong norms are primarily shaped by the sequences. That explains a bit.
http://wiki.lesswrong.com/wiki/Topic_that_must_not_be_named
You have probably neglected to notice the part of that wiki page where it says, “until the end of April 2009.” I have not noticed any significant opposition to discussion of AGI and the singularity after April 2009.
I thought that was how long it was FORBIDDEN, but that even if the ban was relaxed after that, its point was to establish a softer but permanent norm of not talking about it unnecessarily.
No, the ban was always intended to be a moratorium for the first few months. There is no social taboo about talking about AGI here now. It is just like any other scientific topic. Not directly the topic of the site but with a couple of connections which will crop up whenever it is natural.
Interesting, but it seems that this intention has been overruled by forum participants. While there are no posts addressing AGI mechanisms directly, there are many about FAI, the Singularity, and decision theory.
Are there any by official SIAI people? I know such posts exist but my impression was they were slightly frowned upon and supposed to be kept at a minimum.
The SIAI is not the official “owner” of the site; it’s community-managed. The closest thing to official is a site admin. Nesov has certainly posted on AI-related topics.
I think the SIAI is, in fact, at least a partial owner of the site—if not, why is its logo in the header?
I think you’re right, with emphasis on the “at least”.
What I mean is that they do not choose to set the policy for the site directly. This task is delegated to the admins, most of whom are not SIAI employees.
Ok, I was not aware of Nesov having posted SIAI-related material. Yeah, that certainly undermines my position a bit.