The main reason I don’t reply to most posts is that I’m not guaranteed an interesting conversation; often I’d just be explaining a concept that seems obvious if you’ve read the sequences, and those conversations aren’t super fun compared to alternative uses of my time.
For example, the other day I got into a discussion on LessWrong about whether I should worry about claims which are provably useless, and was accused of ignoring inconvenient truths for not doing so.
If the bar to entry were a lot higher, I think I’d comment more (and I think others would too, like TurnTrout).
Maybe we have different experiences because we tend to read different LW content? I skip most of the AI content, so I don’t have a great sense of the quality of comments there. If most AI discussions get a healthy amount of comments, but those comments are mostly noise, then I can certainly understand your perspective.