In the scenario I’m imagining, it doesn’t apply because you don’t fully realize/propagate the fact that you’re filtering evidence for yourself. This is partly because the evidence-filtering strategy is smart enough to filter out evidence about its own activities, and partly just because agency is hard and you don’t fully propagate everything by default.
I’m intending this mostly as an ‘internal’ version of “perverse anti-epistemic social pressures”. There’s a question of why this would exist at all, since it doesn’t seem adaptive. My current guess is some mixture of perverse anti-epistemic social pressures acting on evolutionary timescales, and (again) “agency is hard”—it’s plausible that this kind of thing emerges accidentally from otherwise useful mental architectures, and doesn’t have an easy, universally applicable fix.