Curious whether you have any guesses on what would make it seem like a sympathetic decision to the audience
Off-the-cuff idea, probably a bad one:
Stopping short of “turning off commenting entirely”, the author could make comments on a given post subject to a separate stage of filtering/white-listing. The white-listing criteria are set by the author and made public. Ideally, the system is also not operated by the author directly, but by someone the author expects to be competent at adhering to those criteria (perhaps an LLM, if they’re competent enough at this point; there’s a rough sketch of what that step could look like at the end of this comment).
The system takes direct power out of the author’s hands. They still control the system’s parameters, but there’s a degree of separation now. The author is not engaging in “direct” acts of “tyranny”.
It’s made clear to readers that the comments under a given post have been subject to additional selection, whose level of bias they can estimate by reading the white-listing criteria.
The white-listing criteria are public. Depending on what they are, they can be (a) clearly sympathetic, (b) principled-sounding enough to decrease the impression of ad-hoc acts of tyranny even further.
(Also, ideally, the system doing the selection doesn’t care about what the author wants beyond what they specified in the criteria, making it an arbiter whose bias is bounded and transparent.)
The commenters are clearly made aware that there’s no guarantee their comments on this post will be accepted, so if they decide to spend time writing them, they know what they’re getting into (vs. the bitterness-inducing sequence where someone puts time into a high-effort comment that then gets deleted).
There’s no perceived obligation to respond to comments the author doesn’t want to respond to, because they’re rejected (and ideally the author isn’t even given the chance to read them).
There are no “deleting a highly-upvoted comment” events with terrible optics.
Probably this is still too censorship-y, though? (And it obviously doesn’t solve the problem where people collect all the blacklisted criticism into top-level takedown posts that then get highly upvoted. Though maybe that won’t be as bad or widespread as one might fear.)
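For concreteness, here’s a minimal sketch of what the filtering step could look like, assuming the platform gives you some way to call an LLM. The `llm_judge` helper, the prompt wording, and the ACCEPT/REJECT convention are all made up for illustration, not any existing API:

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class Verdict:
    accepted: bool
    reason: str  # shown to the commenter either way, so rejection isn't silent


def screen_comment(
    comment_text: str,
    whitelist_criteria: str,          # the author's public criteria, verbatim
    llm_judge: Callable[[str], str],  # hypothetical helper: prompt in, raw model text out
) -> Verdict:
    """Decide whether a comment clears the author's public white-listing criteria.

    The judge sees only the criteria and the comment, not the author's identity
    or preferences, so its bias is bounded by what the published criteria say.
    """
    prompt = (
        "You are screening comments on a post. The author has published the "
        "following white-listing criteria:\n\n"
        f"{whitelist_criteria}\n\n"
        "Decide whether the comment below satisfies ALL of the criteria. "
        "Reply with exactly one line: 'ACCEPT: <reason>' or 'REJECT: <reason>'.\n\n"
        f"Comment:\n{comment_text}"
    )
    raw = llm_judge(prompt).strip()
    accepted = raw.upper().startswith("ACCEPT")
    reason = raw.split(":", 1)[1].strip() if ":" in raw else raw
    return Verdict(accepted=accepted, reason=reason)
```

In this sketch, rejected comments never reach the author at all; only the verdict’s reason goes back to the commenter, which is what keeps the author out of the “direct tyranny” loop.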