Assuming that by “it” you mean the decision theory work, the idea that UFAI is a threat, the Many Worlds Interpretation, and other things they actually have endorsed in some fashion, it would be fair enough to talk about how the administrators have posted those things and described them as conclusions of the content, but the articles should accurately convey that this was the extent of “pushing” them. Written from a neutral point of view with the beliefs accurately represented, informing people that the community’s “leaders” have posted arguments for some unusual beliefs (which readers are entitled to judge as they wish) as part of the content would be perfectly reasonable.
It would also be reasonable to talk about the extent to which atheism is implicitly pushed in stronger fashion: theism is treated as assumed wrong in examples throughout the site, not constantly but to a much greater degree. I vaguely recall that the community has non-theists as a strong majority.
The problem is that this is simply not what the articles say. The articles strongly imply that the more unusual beliefs listed above are widely accepted; not merely that they are posted in the content, but that they are believed by Less Wrong members, part of the identity of someone who is a Less Wrong user. This is simply wrong. And the difference is significant: it amounts to accusing everyone interested in a writer’s works of being proponents of that writer’s most unusual beliefs, which are discussed in only a small portion of their total writings. The articles should be fixed so that they convey an accurate impression.
The Scientology comparison is misleading in that Scientology uses cult practices to achieve homogeneity of belief, whereas Less Wrong does not; the poll solidly demonstrates that homogeneity of belief is not happening there. A better analogy would be a community of fans of a philosopher who wrote a great deal and came to some outlandish conclusions in parts, where the fans largely don’t believe the outlandish material. Yes, that material is worth discussing, but presenting it as the belief of the community is wrong even if the philosopher alleges it all fits together. Having an accurate belief here matters, because the two situations have greatly different consequences: there are major practical differences in how useful you’d expect the rest of the content to be, and in how you’d perceive members of the community.
At present, much of these articles reads as a “smear piece” against Less Wrong’s community. As a clear and egregious example, one alleges that members are “libertarian” (clearly a shot at LW, given RW’s readerbase) when the surveys tell us that the most common political affiliation is “liberalism”; “libertarianism” is second and “socialism” third. It does this while citing one of those surveys in the article itself. Many of the problems here are not subtle.
If by “it” you meant the evil-AI-from-the-future thing, it most certainly is not “the belief pushed by the organization running this place”; any reasonable definition of “pushing” something would have to mean communicating it to people and attempting to convince them of it, and if anything they’re credibly trying to stop people from learning about it. There are no secret “higher levels” of Less Wrong content shown only to the “prepared”, and no private venues conveying it to members as they become ready, so from the publicly visible evidence we can be fairly certain that they aren’t communicating it, or endorsing it as a belief, even to ‘selected’ members.
It doesn’t obviously follow from anything posted on Less Wrong; it requires putting a whole bunch of parts together and assuming the result is true.