[1] We must assume that a FASI would not simply reply “You silly creature, becoming omnipotent is not in your best interest, so I will not make you omnipotent because I know better” (or some equivalent). If we assumed it would, we would implicitly be treating the absence of omnipotent beings as evidence for the presence of a FASI. That would force us to treat the eventual presence of omnipotent beings as evidence for the absence of a FASI, which would not make sense.
Nope. The fact that observing near-omnipotent beings would increase the probability of AI doesn’t mean that the probability of near-omnipotent beings is high given AI; it just means that it’s high relative to the probability of observing near-omnipotent beings without the existence of AI.
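To make that distinction concrete, here is a minimal numeric sketch in Python (all probabilities are made-up assumptions purely for illustration, not estimates): the likelihood of near-omnipotent beings given AI can stay tiny, yet observing them still strongly favors AI, because the posterior depends on the likelihood ratio rather than the absolute likelihood.

```python
# Illustrative Bayes calculation; every number below is an assumption.
p_ai = 0.5                  # assumed prior probability that AI exists
p_obs_given_ai = 0.01       # assumed: near-omnipotent beings unlikely even given AI
p_obs_given_no_ai = 0.0001  # assumed: far more unlikely without AI

# Bayes' theorem: P(AI | obs) = P(obs | AI) * P(AI) / P(obs)
p_obs = p_obs_given_ai * p_ai + p_obs_given_no_ai * (1 - p_ai)
p_ai_given_obs = p_obs_given_ai * p_ai / p_obs

print(f"P(obs | AI) = {p_obs_given_ai}")           # still only 1%
print(f"P(AI | obs) = {p_ai_given_obs:.3f}")       # ~0.990: the observation strongly favors AI
```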
Also, note that the universe has a lightspeed limit that might well not be breakable, even by superintelligences.