Did anyone reading this initially get the impression that Less Wrong was cultish when they first discovered it?
I only discovered LW about a week ago, and I got the “cult” impression strongly at first, but decided to stick around anyway because I am having fun talking to you guys, and am learning a lot. The cult impression faded once I carefully read articles and threads on here and realized that they really are rational, well-argued concepts rather than blindly followed dogma. However, it takes time and effort to realize this, and I suspect that the initial appearance of a cult would turn many people off from putting in that time and effort.
For a newcomer expecting discussions about practical ways to overcome bias and think rationally, the focus on things like transhumanism and singularity development seems very weird; those appear to be pseudo-religious ideas with no obvious connection to rationality or daily life.
AI and transhumanism are very interesting, but they are distinct concepts from rationality. I suggest moving singularity- and AI-specific articles to a different site, and removing the Singularity Institute and FHI links from the navigation bar.
There’s also the problem of having a clearly defined leader with strong, controversial opinions which are treated like gospel. I would expect a community that discusses rationality to be more of an open debate among peers, without any philosophical leaders that everybody agrees with. I don’t see any easy solution here, because Eliezer Yudkowsky’s reputation here is well earned: he actually is exceptionally brilliant and rational.
I would also like to see more articles on how to avoid bias and apply Bayesian methods to immediate, present-day problems and decision making. How can we avoid bias and correctly interpret data from scientific experiments, and then apply this knowledge to make good choices about things such as improving our own health?
Random nitpick: a substantial portion of LW disagrees with Eliezer on various issues. If you find yourself actually agreeing with everything he has ever said, then something is probably wrong.
Slightly less healthy for overall debate is that many people automatically attribute a toxic/weird meme to Eliezer whenever it is encountered on LW, even in instances where he has explicitly argued against it (such as utility maximization in the face of very small probabilities).
Upvoted for sounding a lot like the kinds of complaints I’ve heard people say about LW and SIAI.
There is a large barrier to entry here, and if we want to win more, we can’t just blame people for not understanding the message. I’ve been discussing with a friend what is wrong with LW pedagogy (though he admits that it is certainly getting better). To paraphrase his three main arguments:
We often use nomenclature without the explanation a general audience needs. Sure, we make generous use of hyperlinks, but without some effort to bridge the gap in the body of our text, we aren’t exactly signaling openness or friendliness.
We have a tendency to preach to the converted. Or as the friend said:
It’s that classic mistake of talking in a way where you’re convincing or explaining something to yourself or the well-initiated instead of laying out the roadwork for foreigners.
He brought up an example of how material might be introduced to newly exposed folk:
If This American Life explained the financial crisis in an hour so that four million people improved on a written test on the subject, it’s clear you can explain complicated material from near-scratch.
The curse of knowledge can be overcome, but it takes desire and some finesse.
If we intend to win the hearts and minds of the people (or at least make a mark in the greater world), we might want to work on evocative imagery that isn’t only immediately appealing to futurists, technophiles, and sci-fi geeks. Sure, keep the awesome stuff we have, but maybe look for metaphors that work in other domains. Ideally, in my mind, we should build a database of ideas and their parallels in other fields (using some degree of field work to actually find the words that work). Eliezer has done some great work this way, as with HP:MoR and some of his short stories. Maybe the SIAI could shell out money to fund focus groups and interviews à la Luntz, who in my mind is a great Dark Side example of winning.
Edit for formatting, and to mention that outreach and not seeming culty are intertwined in a weird way. It is obvious to me that being The Esoteric Order Of LessWrong doesn’t do the world any favors (or us, for that matter), but by working on outreach, we can be accused of proselytizing. I think it comes down to doing what works without doing the death-spiral stuff. And it seems to me that no matter what is done, detractors are going to detract.
If This American Life explained the financial crisis in an hour so that four million people improved on a written test on the subject, it’s clear you can explain complicated material from near-scratch.
That’s an inspiring goal, but it might be worth pointing out that the This American Life episode was extraordinary. When I heard it, it seemed immediately obvious that this was the most impressively clear and efficient hour I’d heard in many years of listening to NPR.
I’m not saying it’s so magical that it can’t be equaled, I’m saying that it might be worth studying.