A closer rationalist analog of your #3 would be Newcomb’s problem, in which you can’t tell “from the inside” whether you’re really in a Newcomblike situation or in Omega’s simulation of you in that situation, run so that Omega can decide what to put in the sealed box.