Related: in my ideal world there would be a wrapper version of LessWrong which is like the Alignment Forum (just focused on transformative AI) but where anyone can post. By default, I’d probably end up recommending that people interested in AI go there, since the other content on LessWrong isn’t relevant to them.
One proposal for this:
Use a separate url (e.g. aiforum.com, or you could give up on the Alignment Forum as it currently exists and reuse that existing url).
This is a shallow wrapper on LW in the same way the alignment forum is, but anyone can post.
All posts tagged with AI are crossposted (and can maybe be de-crossposted by a moderator if a post isn’t actually relevant). (And if you post using the separate url, the post automatically also appears on LW and is always tagged with AI.)
Maybe you add some mechanism for tagging quick takes or manually crossposting them (similar to how you can crosspost quick takes to the Alignment Forum now).
Ideally the home page of the website would default to placing more visual emphasis on key research and key explanations/intros, rather than focusing as much on the latest events.
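The crossposting rules above could be sketched roughly as follows. This is only an illustrative sketch of the proposed behavior; the names (`Post`, `submit`, etc.) are hypothetical and don’t correspond to any real LessWrong API.

```python
from dataclasses import dataclass, field

AI_TAG = "AI"

@dataclass
class Post:
    title: str
    tags: set = field(default_factory=set)
    on_lw: bool = True            # everything lives on LW underneath
    on_ai_forum: bool = False
    mod_removed: bool = False     # a moderator de-crossposted it

def submit(post: Post, via_ai_forum: bool) -> Post:
    """Apply the proposed crossposting rules when a post is submitted."""
    if via_ai_forum:
        # Posting via the wrapper url: always on LW, always tagged AI.
        post.on_lw = True
        post.tags.add(AI_TAG)
    # Any AI-tagged post is crossposted unless a moderator removed it.
    post.on_ai_forum = AI_TAG in post.tags and not post.mod_removed
    return post

p = submit(Post("Scaling laws"), via_ai_forum=True)
assert p.on_lw and p.on_ai_forum and AI_TAG in p.tags

q = submit(Post("Gardening tips"), via_ai_forum=False)
assert q.on_lw and not q.on_ai_forum
```

The key asymmetry is that the wrapper is a view, not a separate database: posting through either url lands on LW, and the AI tag alone decides visibility on the wrapper.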
Yeah, I think something like this might make sense to do one of these days. I am not super enthused with the current AI Alignment Forum setup.