That may be! Unfortunately, for the moment LLMs make it trivial for anyone to generate large amounts of text that requires extended attention to evaluate, and so LessWrong is currently flooded with LLM-generated content (as are many other venues, and many people, myself included). In the longer run there will hopefully be better solutions, but at the moment my strategy is to mostly ignore LLM-written content unless it's from sources that have already established credibility with me in one way or another. Maybe your project will be one of those solutions.
(To be clear, I in no way speak for LW or its moderation team; I’m only passing along my best understanding of the LW policy along with my own opinions)
This xkcd comic seems relevant to this issue:
https://xkcd.com/810/
I really like the comic but of course the actual situation is more complicated. It’s something I’d like to understand better and develop potential solutions for.