Suppose we did the experiments and found other policies more winning than rationality. Would you adopt the most winning policy?
If not, then admit that you value rationality, and stop demanding that it win.
If rationality is defined as making the decisions that maximise expected utility in a given situation, then it is by definition more winning, and the question would be nonsensical.
If another definition of rationality is implied, then I don’t think Eliezer demanded that it win.
That would be a rational thing to do!
I do have components of my utility function that value certain rituals of cognition (as described in the segment on Fun Theory), but net wins beyond that point would compel me.