If we are going to allow you a chance to figure out the behavior of Mu, then Mu should be given the chance to find out the behavior of Eliezer (what programs you are likely to produce, etc.). Only then would information parity be preserved.
Mu is standing in for the entire world, and your system is a small part of it. It is entirely reasonable to expect the world to know more about your system than you know about the behavior of the world. I'm not sure where you are taking this idea, but it is unrealistic in my view.
Optimality through obscurity?