This certainly sounds correct.
Nevertheless, there is a pretty serious problem here if you believe (as I do) that a large part of what made LW great early on, in terms of rationality content, was a selection effect drawing the best and most insightful rationality-interested writers to the site: people like Eliezer (obviously), Robin Hanson, Anna Salamon, Scott back when he was known as Yvain, gwern, and (later on) Duncan Sabien.
Once the well of insightful new blood runs dry (because the lowest-hanging potential contributors have already been attracted to the site) and the old guard retires or moves on, keeping the pipeline alive depends more on in-house training: building a culture and community that turns rationality learners into dojo teachers. (In my understanding, something similar caused the death of LW 1.0 as well.)
Not only is this hard in theory and has mostly not panned out in practice, it also doesn't seem to have been prioritized all that much. Has LW-rationality been marketed as the common interest of many causes and many types of individuals, or has it narrowly appealed to nerdy, CS-inclined young Western STEM types? Has there been a real emphasis on raising the sanity waterline globally, or have projects marketed as doing that actually focused on building narrow pipelines of math talent into MIRI? Was the challenge of writing a Level 2 to the Sequences, which Eliezer endorsed, actually taken up, or was it all left to Eliezer himself to come back and partly write it?