If we can put aside for a moment the question of whether Matthew Barnett has good takes, I think it’s worth noting that this reaction reminds me of how outsiders sometimes feel about effective altruism or rationalism:
I guess I feel that his posts tend to be framed in a really strange way, such that even though there’s often some really good research there, they’re more likely to confuse the average reader than anything else, and even when I can untangle the frames, I usually don’t find it worth the time.
The root cause may be that there is too much inferential distance, too many differences of basic worldview assumptions, to easily have a productive conversation. The argument contained in any given post might rely on background assumptions that would take a long time to explain and debate. It can be very difficult to have a productive conversation with someone who doesn’t share your basic worldview. That’s one of the reasons that LessWrong encourages users to read foundational material on rationalism before commenting or posting. It’s also why scalable oversight researchers like having places to talk to each other about the best approaches to LLM-assisted reward generation, without needing to justify each time whether that strategy is doomed from the start. And it’s part of why I think it’s useful to create scenes that operate on different worldview assumptions: it’s worth working out the implications of specific beliefs without needing to justify those beliefs each time.
Of course, this doesn’t mean that Matthew Barnett has good takes. Maybe you find his posts confusing not because of inferential distance, but because they’re illogical and wrong. Personally I think they’re good, and I wouldn’t have written this post if I didn’t. But I haven’t actually argued that here, and I don’t really want to—that’s better done in the comments on his posts.
The quoted passage from Chris is actually a beautiful exposition of how Alasdair MacIntyre describes the feeling of encountering reasoning from an alternate “tradition of thought” to which one is an alien: the things that such a tradition says seem alternately obviously true or confusingly framed; the tradition focuses on things you think are unimportant; and the tradition seems apt to confuse people, particularly, of course, the noobs who haven’t learned the really important concepts yet.
MacIntyre talks a lot about how, although traditions of thought make tradition-independent truth claims, adjudicating between the claims of different traditions is typically hard-to-impossible because of the different standards of rationality within them. Here’s someone describing MacIntyre:
MacIntyre says [that conflicts between traditions] achieve resolution only when they move through at least two stages: one in which each tradition describes and judges its rivals only in its own terms, and a second in which it becomes possible to understand one’s rivals in their own terms and thus to find new reasons for changing one’s mind. Moving from the first stage to the second “requires a rare gift of empathy as well as of intellectual insight”.
This is kinda MacIntyre’s way of talking about what LW talks about as inferential distances—or, as I now tend to think about it, about how pretraining on different corpora gives you a very different ontology. I don’t think either of those framings is really sufficient, though?
I’m not really going anywhere with this comment, I just find MacIntyre’s perspective on this really illuminating, and something I broadly endorse.
I think LW has a pretty thick intellectual tradition at this point, with a pretty thick bundle of both explicit and implicit presuppositions, and it’s unsurprising that people within it find even very well-informed critiques of it mostly irrelevant, just as it’s unsurprising that a lot of people critiquing it don’t really seem to actually engage with it. (I do find it frustrating that people within the tradition seem to take this situation as a sign of the truth-speaking nature of LW, though.)
Not taking critiques of your methods seriously is a huge problem for truth-speaking. What well-informed critiques are you thinking of? I want to make sure I’ve taken them on board.
Perhaps, although I generally become more sympathetic to someone’s point of view the more I read from them.
And it’s part of why I think it’s useful to create scenes that operate on different worldview assumptions: it’s worth working out the implications of specific beliefs without needing to justify those beliefs each time.
I used to lean more strongly towards more schools of thought being good; however, I’ve updated slightly on the margin towards believing that some schools of thought just end up muddying the waters.
That said, Epoch has done some great research, so I’m overall happy the scene exists. And I think Matthew Barnett is extremely talented, I just think he’s unfortunately become confused.