I agree with what you said about how we introduce ourselves.
As for your possible improvement, I don’t know whether everyone here cares about the latter two points. It seems that a lot do, but I’m not sure whether the number of people is over the “threshold” where it makes sense to generalize.
Anyway, I’ve always felt pretty strongly that the goals of rationality are really simple and straightforward, and that it’s something everyone should be interested in. At its core, rationality is just about:
1) Getting what you want.
2) Being right.
Everyone tries to get what they want, whether it’s good grades, money, health, or altruism. And people generally don’t do such a great job at it. Shouldn’t they want to do a better job?
And everyone wants to be right. Everyone has their opinions on how things really work, and what will happen in the future. But shouldn’t they want to be better at it?
If someone comes up to you and says, “Hey, I’ve got some ideas about how you could do a better job of getting what you want and understanding how the world works. Interested?”, how could you not be interested in those things?!* (One problem might be credibility; i.e., people might respond by saying, “Yeah, right.”)
*I sense that a big problem is that “getting what you want” and “understanding how the world works” are Lost Purposes for most people. So it’s probably good to give an example of each that everyone can relate to, and that hits an actual pain point: something people are struggling with that they actually want to get better at, not something they merely should want to get better at. But I think it’s important to focus on principles and not sound “self-help-y” (which seems a lot easier said than done).
*I sense that another problem is that people don’t want to identify as a “rationalist”. I get that impression from most people I tell about the site. Other people explicitly say that it feels cult-y.
If someone does that, I would get very, very sceptical. Credibility is the problem here: self-help sites are a dime a dozen.
True :/
My first thoughts on how it could be mitigated:
1) Being referred to the site by someone you trust.
2) Signaling of quality. For example, mentions of decision theory may signal quality to technically minded people. But other things signal quality to “normal people”; I’d have to think harder to come up with good examples.
3) Design and activity. I’m into startups, and after failing at my first one, I’ve realized how important these things are. Design is important in and of itself (as far as user experience goes), but it’s also important because it signals quality. People often won’t give things with poor design a chance, because they notice a correlation between design and quality. A similar point applies to activity: seeing lots of articles and comments serves as social proof of quality.
4) Proving quality. The “chicken-and-egg” problem of trustworthiness is encountered everywhere, but quality does seem to win (sort of). I sense that enough people give stuff a shot that quality wins out to some extent. If my thinking is on track here, then quick wins are important. It’s important to have some “start here” articles that new readers could read and think, “Whoa, this is really cool and useful! This definitely isn’t one of those sketchy self-help websites. I’m not sure what the concentration of quality is on this site, but after reading these first two articles I think it’s worth reading a few more to find out.”
Honestly, my impression is that the obstacles of Lost Purposes and not wanting to identify as a rationalist are notably bigger than the obstacle of credibility.
In general, I don’t think it makes sense to tell people about LW. It makes much more sense to link someone to an article on LW that’s likely worth reading for that person. If they like what they find, maybe they read more.