An Oracular non-AI is a question-answering or otherwise informative system that is not goal-seeking and has no internal parts that are goal-seeking, i.e., not an AI at all. Informally, an Oracular non-AI is something like a “nearly AI-complete calculator”: it implements a function from input “questions” to output “answers.”
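As a rough sketch (the names here are hypothetical, not from the original post), the intended interface is just a pure function: no persistent state, no actions, and no internal search over world-outcomes.

```python
# Hypothetical sketch of the Oracular non-AI interface: a pure function
# from questions to answers. It keeps no state, takes no actions, and
# nothing inside it optimizes for the consequences of its answers.

def oracle(question: str) -> str:
    # Stand-in lookup; a real system would compute the answer,
    # but still without any goal-seeking parts.
    answers = {"What is 2 + 2?": "4"}
    return answers.get(question, "unknown")
```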
What if I ask it “What should I do?” or “What would Cthulhu do?” Questions can contain or point to goal-seeking structure, even if the non-AI on its own doesn’t. The system may be unable to give an accurate answer to such questions, but computing one can still involve goal-seeking behavior, so this isn’t a clear reduction of FAI (as you argue in “Oracular non-AIs: Advisors”).
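To make the worry concrete, here is a toy sketch (hypothetical names throughout): the oracle’s top-level interface is still a pure question-to-answer function, but answering “What would X do?” accurately requires running an embedded optimizer.

```python
# Toy illustration (hypothetical): answering "What would agent X do?"
# smuggles goal-seeking computation into a question-answering system.

def simulate_agent(goal, options):
    # Embedded optimizer: picks whichever option best serves `goal`.
    return max(options, key=goal)

def oracle(question: str) -> str:
    if question == "What would the paperclip maximizer do?":
        # The answer is produced by goal-directed search, pointed to
        # by the question rather than built into the oracle itself.
        return simulate_agent(
            goal=lambda plan: plan.count("paperclip"),
            options=["do nothing", "build a paperclip factory"],
        )
    return "unknown"

print(oracle("What would the paperclip maximizer do?"))
# -> "build a paperclip factory"
```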
A predictor whose output influences the outcome it predicts also reduces to at least an Oracular non-AI, and shows one way in which a “non-AI” can exhibit goal-seeking behavior: when several predictions would each be self-fulfilling, whichever one the predictor settles on effectively steers the world toward that outcome.
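One way to see this (a toy model with invented names, not anything from the original discussion): if the world reacts to the announced prediction, then a predictor that searches for a prediction that comes true once announced is, in effect, choosing the outcome.

```python
# Toy model (hypothetical): a predictor whose announced output influences
# the outcome it predicts. Searching for a self-fulfilling prediction is
# effectively choosing which outcome to bring about.

def world(announced: str) -> str:
    # Stand-in for reality reacting to the announcement, e.g. a market
    # that crashes if a crash is publicly predicted.
    return "crash" if announced == "crash" else "boom"

def fixed_point_predictor(candidates=("boom", "crash")) -> str:
    # Look for an announcement p that comes true given that it is made.
    # Here *both* candidates are fixed points, so the search order --
    # an arbitrary implementation detail -- decides the outcome.
    for p in candidates:
        if world(p) == p:
            return p
    raise RuntimeError("no self-fulfilling prediction found")

print(fixed_point_predictor())            # -> "boom"
print(fixed_point_predictor(("crash",)))  # -> "crash": same world, different outcome
```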