It felt a little abstract and difficult for me to understand. For example, “systematically arrives at true beliefs and good decisions” in the first paragraph. Not that it’s easy, or that I have a better idea myself, but I think it could be improved by explaining it more plainly and with some helpful examples.
I like the idea of talking about our feelings on well-kept gardens. Probably towards the beginning. I think people would empathize with it, respect it, and be more willing to invest the time to onboard.
Relatedly, this onboarding takes a very long time. The Sequences are super long. Not that you necessarily have to read them all, but still. There’s just so much. I think that is something that we should be upfront about.
I really like this Shit Rationalists Say video. I think it captures an impressive number of things about what rationalists are like in a quick, fun, and entertaining way. It may seem like a toy, but I suspect that it’d be very useful for newcomers. (Maybe worth mentioning that it’s hyperbolic.)
It seems worth pointing to the FAQ. I’d imagine that new users would wonder about various things that are addressed in the FAQ. Although maybe it’s enough that the FAQ is discoverable in the side navigation.
User research is always important, of course. Getting feedback in the comments section of this post is one thing, but it’s also important to see how people who are actually new to LessWrong react to this.
Maybe it’d be good to mention that we have meetups in X number of cities across the world. I feel like the community aspect was undersold. Although I’m not sure how relevant that is to a new user. It seems nice to know, but I’m not sure.
To me it seems like a good idea to call out that we believe in a bunch of things that most people think are wacky. Intelligence explosion, cryonics, transhumanism, polyamory, circling. Better to filter out people who react strongly against those sorts of things from the get-go if you ask me.
I think it’s good to point out that the LW audience is far more contrarian than the median audience, and that arguments from conformity or authority or public relations or the absurdity heuristic aren’t received well. That said, I would not want to imply that there’s a belief litmus test, and I also expect that a significant fraction of LW members don’t agree with / endorse / believe in at least one of these examples.
Agreed. However, I think you can sort of have your cake and eat it too here. I think you can:
1. Say that a lot of us believe things that most others see as wacky.
2. Give examples of those things.
3. Be clear that a significant number of people on LW don’t believe in a lot of that stuff.
4. Be clear that belief in that stuff isn’t expected from new members, and that you aren’t expected to eventually reach agreement.
I think 4 is a really good point, though, and it didn’t occur to me when I wrote my initial comment, so thanks for pointing that out. At the same time, I still endorse the “filter out people who react strongly against it” part. If 1, 2, 3, and 4 are all made clear and someone, seeing that there’s a lot of belief in wacky ideas, is still turned off, I expect that they wouldn’t have been a good fit for the community anyway, so it’s better to “fail fast”.
HPMoR!
Thanks, this is really helpful!
Sure thing!