You just have to think, "I will do this, and then my opponent must rationally do that." You have a completely watertight argument. Then your opponent goes and does something else.
A model of reality that assumes an opponent must be rational is an incorrect model. At best, it is a good approximation that may luckily return the correct answer in some situations.
I think this is a frequent bias among smart people—assuming that (1) my reasoning is flawless, and (2) my opponent is on the same level of rationality as me, therefore (3) my opponent must have the same model of the situation as me, therefore (4) if I rationally predict that it is best for my opponent to do X, my opponent will really do X. And then my opponent does non-X, and I am like: WTF?!