Religion for Rationalists -- very low, maybe 1/10? It just doesn't seem like the type of thing that has an easy truth-value to it, which is frustrating. I definitely buy the theoretical argument that religion is instrumentally rational for many people[1]; what's lacking is empirics and/or models. But nothing in the post itself is woo-y.
Symmetry Theory of Valence -- 5-6/10? I dunno, I've looked into it a bit over the years, but it's far from any of the things I've personally deeply studied. They trigger a bunch of red flags; however, I'd be surprised but not shocked if it turns out I'm completely wrong here. I know Scott (whose epistemics I broadly trust) and somebody else I know endorse them.
But tbc I’m not the arbiter of what is and is not woo lol.
And I’m open to the instrumental rationality being large enough that it even increases epistemic rationality. Analogy: if you’re a scientist who’s asked to believe a false thing to retain your funding, it might well be worth it even from a purely truth-seeking perspective, though of course it’s a dangerous path.
Totally. I asked only to get a better model of what you were pointing at.
And now my understanding is that we're mostly aligned, and this isn't a deep disagreement about what's valuable, just a labeling and/or style/standard-of-effort issue.
E.g. Symmetry Theory of Valence seems like the most cruxy example, because it combines an above-average standard of effort and clarity of reasoning ("I believe X because Y, which could be tested through Z") with a whole bunch of things that I'd agree count as red flags by the duck-test standard.
The Beneath Psychology sequence is too long for me to rate, sorry!