I feel like you think this is a transgressive boundary or otherwise surprising, whereas I think it mostly seems fine. I’ll write some thoughts regardless.
You write fairly broadly here. I know (broadly) what you’re talking about, and I’m sad about it too, such as when I’m regularly in spaces (e.g. CFAR) where I would’ve expected to have common knowledge that everyone’s read the Sequences, understands the concept of a technical explanation of a technical explanation, knows how many bits of information it takes to justify a belief, etc, but I don’t. I don’t feel like “the epistemics are failing” is the coarse-grained description I’d use; I think there are more details about which bits are going wrong and why (and which bits actually seem to be going quite excellently!), but I wanted to agree with feeling sad about this particular bit.
many of these ideas have essentially never been justified using the paradigm that the community already operates in
I am not sure whether you feel this way when reading LessWrong though. If you scroll through the curated posts of the last few months, I don’t expect it mostly seems like a lot of obviously terrible ideas are being treated unsceptically (though you’re welcome to surprise me and say it seems just as bad!).
(A few counterexamples on LessWrong: Oliver wrote an attempted translation for chakras the other day. Kaj’s most popular post (277 karma!) was an attempt to explain a bunch of enlightenment/meditation stuff in non-mysterious terms, and he has a whole interesting sequence offering explicit models behind things like Internal Family Systems. After Scott began a discussion of cultural evolution, Vaniver wrote a post I found fascinating, Steelmanning Divination. I wrote in pretty explicit language about my experience circling here. Zvi has written in a ‘postmodernist beat-poem’ style about things that are out to get you and why choices are bad, but he also tries to give simple, microeconomic explanations for how systems (like blackmail and Facebook) can optimise to destroy all value. Back on the cultural evolution frame, Zvi and Ben have both elucidated explicit models for why the Sabbath is an important institution one should respect.)

(Not to mention the great obvious straightforward rationality writing, like Abram’s recent Mistakes With Conservation of Expected Evidence and loads more.)
So when you talk about weird/mysterious ideas not being explained within an explicit and clear epistemology, I do want to say that I think people on LessWrong are often making that effort, and that we’ve tried to signal we have higher standards here. It’s okay to write poetry and so on when you’ve not yet learned how to make your idea explicit, but the goal is a technical understanding that comes with an explicit, communicable model.
My impression is that for the majority of my audience, my efforts to show how everything adds up to normality are redundant, and mostly they’re going by a vague feeling. Overall, it seems to me that there are people trying to do the kind of translational work Davis is asking for, but the community is not, as a whole, applying the sort of discernment that would demand such work. Whether or not this is the problem Davis is trying to identify, I’m worried enough about it that LessWrong has been getting less and less interesting to me as a community to engage with. You’re probably by far the person most worth reaching who isn’t already in my “faction,” such as it is, and Davis is one of the few others who seem to be trying to make sense of things at all.
Overall, it seems to me that there are people trying to do the kind of translational work Davis is asking for, but the community is not, as a whole, applying the sort of discernment that would demand such work.
Agreed, yeah. This is maybe the main thing I’m getting at—I’m trying to shock people into realizing “hey, everything isn’t fine, things are going wrong” and applying more discernment to what’s going on in the community.
I don’t feel like “the epistemics are failing” is the coarse-grained description I’d use; I think there are more details about which bits are going wrong and why (and which bits actually seem to be going quite excellently!), but I wanted to agree with feeling sad about this particular bit.
I expect it would be quite useful both here and more generally for you to expand on your model of this.