Your strength as a rationalist is your ability to be more confused by fiction than by reality.
Does that lead to the conclusion that Newcomb’s problem is irrelevant? Mind-reading aliens are pretty clearly fiction. Anyone who says otherwise is much more likely to be schizophrenic than to have actual information about mind-reading aliens.
I’m pretty sure Eliezer thinks this heuristic should be applied to events that occurred in the past, not ones that will occur in the future—it’s a way of assessing whether a piece of evidence should be trusted or whether we should defy it. It’s also a way of weeding out hypotheses that don’t actually make experimental predictions. I don’t think he’s trying to say that we should ignore things that seem weird, particularly because he speaks out against the absurdity heuristic later on.
This reminds me of the very first comment on the Pascal’s Mugging post.
Thought experiments are good for asking "how meaningful someone's position on an issue would be if it were taken to its logical extreme."