The most important thing is “There is a small number of individuals who are paying attention, who you can argue with, and if you don’t like what they’re doing, I encourage you to write blogposts or comments complaining about it. And if your arguments make sense to me/us, we might change our mind. If they don’t make sense, but there seems to be some consensus that the arguments are true, we might lose the Mandate of Heaven or something.”
There’s not, like, anything necessarily wrong with this, on its own terms, but… this is definitely not what “being held accountable” is.
The fact that it's happening at all already constitutes "going wrong".
This sort of comment doesn't particularly move me.
All this really means is that you'll just do with this whatever you feel like doing. Which, again, is not necessarily "wrong", and really it's the default scenario for, like… websites, in general… I just want to emphasize that "being held accountable" has approximately nothing to do with anything that you're describing.
As far as the specifics go… well, the bad effect here is that instead of the site being a way for me to read the ideas and commentary of people whose thoughts and writings I find interesting, it becomes just another purveyor of AI "extruded writing product". I really don't know why I'd want more of that than there already is, all over the internet. I mean… it's a bad thing. Pretty straightforwardly. If you don't think so, then I don't know what to tell you.
All I can say is that this sort of thing drastically reduces my interest in participating here. But then, my participation level has already been fairly low for a while, so… maybe that doesn’t matter very much, either. On the other hand, I don’t think that I’m the only one who has this opinion of LLM outputs.
I don't, in fact, use Google very much these days, and don't particularly recommend that anyone else do so, either.
(If by “google” you meant “search engines in general”, then that’s a bit different, of course. But then, the analogy here would be to something like “carefully select which LLM products you use, try to minimize their use, avoid the popular ones, and otherwise take all possible steps to ensure that LLMs affect what you see and do as little as possible”.)