I agree with you that the motivational bits, of wanting to acculturate to LW to be around the cool people, rely on the cool people being here.
The main reason I’m uncertain about the forum as the right model is that I don’t see it in many other educational contexts and I think there are weird dynamics around the asymmetry between questioners and answerers and levels of competence/experience. (The cool people want some, but not too much, interaction with not-yet-cool people.) Perhaps the Slack and IRC channels and similar venues deserve some more of my attention as potential solutions here.
Vaniver’s other suggestion for something that would serve this need better than a Redditalike is Stack Overflow. That’s a better fit, but the SO model works best when what people need are answers to specific questions with clear-cut solutions.
Agreed. This dynamic gets even worse when the problems are psychological. If someone goes to Stack Overflow and posts “hey, this code doesn’t do what I expect. What’s going wrong?” we can copy the code and run it on our machines and find the issue. If someone goes to Sanity Overflow and posts “hey, I’m akratic. What’s going wrong?” we… have a much harder time.
One of the things that comes up every now and then is the idea of rewriting the Sequences, and I think the main goal there would be to let as little of Eliezer’s personality shine through as possible. (I like his personality well enough, but it’s clear that many don’t, and a more communal central repository would reduce some of the idiosyncrasy concerns.)
Some think that the Sequences could be significantly shortened, but I suspect that’s optimism speaking instead of experience. There are only a handful of sections in the Sequences where Eliezer actually repeats himself, and even then it’s likely one of those places where, really, it’s worth giving readers three slightly different versions of the same thing to make sure they get it.
For what it’s worth, I got relatively little[1] out of reading the Sequences solo, in any form (and RAZ is worse than LW in this regard, because the comments were worth something even on really old and inactive threads, and surprisingly many threads were still active when I first joined the site in 2014).
What really did the job for me was the reading group started by another then-Seattleite[2]. We started as a small group (I forget how many people the first meetings had, but it was a while before we broke 10 and longer before we did it regularly) that simply worked through the core sequences—Map & Territory, then How to Actually Change Your Mind—in order (as determined by posts on the sequences themselves at first, and later by the order of Rationality: AI to Zombies chapters). Each week, we’d read the next 4-6 posts (generally adjusted for length) and then meet for roughly 90 minutes to talk about them in groups of 4-8 (as more people started coming, we began splitting up for the discussions). Then we’d (mostly) all go to dinner together, at which we’d talk about anything—the reading topics, other Rationality-esque things, or anything else a group of smart mostly-20-somethings might chat about—and next week we’d do it again.
If there’s such a group near you, go to it! If not, try to get it started. Starting one of these groups is non-trivial. I was already considering the idea before I met the person who actually made it happen (and I met her through OKCupid, not LessWrong or the local rationality/EA community), but I wouldn’t have done it anywhere near as well as she did. On the other hand, maybe you have the skills and connections (she did) and just need the encouragement. Or maybe you know somebody else who has what it takes, and need to go encourage them.
[1] Reading the Sequences by myself, the concepts were very “slippery”; I might have technically remembered them, but I didn’t internalize them. If there was anything I disagreed with or that seemed unrealistic—and this wasn’t so very uncommon—it made me discount the whole post to effectively nothing. Even when something seemed totally, brilliantly true, it also felt untested to me, because I hadn’t talked about it with anybody. Going to the group fixed all of that. While it’s not really what you’re asking for, you may find it does the trick.
[2] She has since moved to (of course) the Bay Area. Nonetheless, the group continues (now roughly two years running, meeting nearly every Monday evening). We regularly break 20 attendees now, occasionally break 30, and the “get dinner together” follow-up has grown into a regularly scheduled weekly event in its own right at one of the local rationalist houses.
Agreed that the above won’t work for all people, not even all people who say
> I haven’t and probably can’t internalize it on a very deep, systematic level, no matter how many times I re-read the articles
Nonetheless, I find it a useful thing to consider, both because it’s a lot easier (even if there isn’t yet such a group in your area) than writing an entire LW-inspired rationality textbook, and because it’s something a person can arrange without needing to have already internalized everything (which might be a prerequisite for the “write the textbook” approach). It also provides benefits well beyond solving the specific problem of internalizing the material: I have discovered new material I would not otherwise have found as early, if at all; I have engaged in discussions related to the readings that caused me to update other beliefs; and I have formed a new social circle of people with whom I can discuss topics in a manner that none of my other circles support.
I have read somewhere that, all else being equal, dialogues hold people’s attention better than monologues, at least on television. Perhaps some ideas (including old Sequence posts, especially the more controversial ones) could be presented as Socratic dialogues; or, if a post is being written collaboratively, one author could state a position while one or two others ask probing questions and try to find holes in the argument. You might think that comments already cover this, and in a sense it is similar to having two waves of comments. But first, the post that most people actually see has already addressed at least a few objections and is thus of relatively higher quality. Second, this allows “debate” posts that present no clear conclusion and contain only the arguments for different positions (where exactly the controversy lies is often an interesting and informative question in itself). Third, I conjecture that it is psychologically more pleasant to be nitpicked by one or two people whom you know were explicitly asked to do so than by many commenters at once. You could call the series “Dialogues Concerning (Human) Rationality” or something like that.
Of course, not all posts should be written as dialogues (e.g. some more technical ones might be difficult to structure this way).
Upvoted, but this seems to vary from person to person. You also forgot how italics and lists work here.
Gah, thank you, edited. Markdown is my nemesis.
I suspect the main benefit from rewriting the Sequences would actually be that it would be an excuse to post useful stuff about rationality again.