Yeah, this highlights my overall issue with the OP.
Elon Musk’s path to success is well-known and not replicable. His story relies too much on (1) luck and (2) high-IQ-plus-ultra-high-conscientiousness, in that order of importance. Elon Musk is a red herring in these discussions.
More to the point, there is already an absurd overabundance of available information about how to be quite successful in business. It is not to the comparative advantage of LW to try to replicate this type of content. Likewise, the Internet hosts an absurd overabundance of practical, useful advice on:

- how to exercise, with the aim of producing any given physical result
- how to succeed at dating, to whatever end desired
- how to manage one’s personal finances
- etc.
It is not the role of LW to comprehensively answer all these questions. LW has always leaned more toward rationality qua rationality. More strategy, less tactics.
Also, I think the OP is attacking a straw man to a large degree. Nobody here thinks that LW has already immanentized the eschaton. Nobody here thinks that LW has already solved rationality. We’re just a group of people interested in thinking about and discussing these types of considerations.
All that said, when I first discovered LW (and particularly the Sequences), it was such a cognitive bombshell that I did genuinely expect my life and mind to be completely changed. And that expectation was sort of borne out, but in ways that only make sense in a post hoc fashion. As in, I used LW-inspired cognition for a lot of major life choices, but it’s impossible to do A/B testing and determine whether those were the right choices, because I don’t have access to the world where I made the opposite choice. (People elsewhere in this very comment thread repeat the meme that “LWers are not more rational than average.” Well, how would you know if they were? What would that even mean?)
You could find some people similar to you, and mumble certain incantations.
Just because the example wasn’t well-chosen doesn’t invalidate the argument per se.