Perhaps one effect is that LLM writing assistance moves some people from "too incoherent or bad at words to post anything at all, or to get any attention if they do" to "able to put together something good enough to gain attention" — with the LLMs increasing verbal fluency without improving the underlying logic. Result: we see more illogical stuff being posted, technically as a result of people using LLMs.