Basically. The anthropic selection effect is, “you’ll never find yourself in a world where you’re dead”. The generalized pseudo-anthropic selection effect is, “you’ll never find yourself in a world where you’re relatively unable to influence significant decisions”. ‘Course, this is only subjectively true, so it applies to me and not you. But if I were you, it’d apply to you too. But I’m not you, so I can’t honestly endorse this as a norm, because counterfactuals screw with folks’ heads, and people confuse prescriptions and descriptions too easily.
Oh yeah, quantum suicide. The problem is that with sufficiently bad car accidents, the worlds where I survive with, say, full-body paralysis or something vastly outnumber those where I survive in a reasonably intact body.
ETA: Wait, that’s the “plain” anthropic selection effect. I don’t get how your generalized pseudo-anthropic selection effect is supposed to work.