Analogy to the Heisenberg Uncertainty Principle for Powerful AI?

What do you think? There might be a theoretical limit to how much data an AI could collect without influencing the data itself, which would make its own predictions unreliable. Would this negate the idea of a ‘God’ AI and cause it to make suboptimal choices even with near-limitless processing power?
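
To make the worry concrete, here is a minimal toy sketch (not a physics claim) of the feedback loop I have in mind: a predictor forecasts a quantity, but the world reacts to the published forecast, so the act of predicting shifts the thing being predicted. The `reaction` parameter and the naive "same as last time" predictor are made-up assumptions purely for illustration.

```python
import random

def run(steps=2000, reaction=0.0, seed=0, burn_in=100):
    """Average forecast error when the world does / doesn't react to the forecast."""
    rng = random.Random(seed)
    demand = 5.0              # latent quantity the predictor is trying to forecast
    last_outcome = demand
    errors = []
    for _ in range(steps):
        demand += rng.gauss(0, 0.05)      # the world drifts on its own
        forecast = last_outcome           # naive predictor: "same as last time"
        # Observer effect: publishing the forecast changes behaviour, so the
        # realized outcome is pushed away in proportion to the forecast itself.
        outcome = demand - reaction * forecast
        errors.append(abs(outcome - forecast))
        last_outcome = outcome
    steady = errors[burn_in:]             # discard the initial transient
    return sum(steady) / len(steady)

if __name__ == "__main__":
    print("mean error, world ignores the forecast :", round(run(reaction=0.0), 3))
    print("mean error, world reacts to the forecast:", round(run(reaction=0.9), 3))
```

In this toy setup the average error stays noticeably higher whenever the world reacts to the forecast, no matter how long the run is, because every forecast perturbs the very data the next forecast depends on. That is the kind of limit I'm asking about: is it just a modelling nuisance, or a hard ceiling that extra processing power can't buy its way past?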