So far, I genuinely have not gotten much object-level pushback on the most load-bearing points of my sequence, so I’m not that worried.
I think this probably underestimates the severity of founder effects in this community. A salient example to me is precise Bayesianism (and the cluster of epistemologies that try to “approximate” Bayesianism): As far as I can tell, the rationalist and EA communities went over a decade without anyone within these communities pushing back on the true weak points of this epistemology, which is extremely load-bearing for cause prioritization. I think in hindsight we should have been worried about missing this sort of thing.
If we consider the views of Oxford’s EA philosophers as also having some founding influence on LW and the broader adjacent communities, then it becomes a bit less clear how strongly the founding effect is pointing in the direction of anti-realism.
In any case, I should flag that I no longer think “no object level pushback, therefore I’m not that worried” is a good way of putting it. Instead, I would now put it as follows: “No object level pushback, therefore the burden of proof is no longer on me, and anyone who claims I’m being too confident in my views is on no firmer ground with their position than I am.”
(On whether LW is too stuck within founder effects in general: Your example would be pretty telling and damning if we assume that you’re correct, but my guess is that most readers here will assume you’re wrong about it. Someone in your position could still be right, of course; I’m just saying that this wouldn’t yet be apparent to readers.)
Fair enough! :) The parallel I had in mind was “[almost] no object level pushback”, or at least almost no object level pushback that I can tell is based on an accurate understanding of my arguments.
Ah, right. It hasn’t been that long yet, IMO, but if this continues for, say, two years, with no one changing their mind but also ~no one engaging with the arguments directly and substantively, that would be disappointing.
In your case, the arguments seem more radical, unlike with arguments for anti-realism, where one readily available reason for not engaging much is that people think “I probably have similar enough views already.”
For me, epistemology was never my special interest, so I’m not that well-positioned to dive into the topic and try writing a critique or commentary, but I hope that someone else ends up doing it.