if you have 2 AI’s that have entirely opposite utility functions, yet which assign different probabilities to events, they can work together in ways you don’t want
That is a good point, and this can indeed happen. If I believe something is a piece of chocolate while you—hating me—believe it is poison, we will happily coordinate towards me eating it.
I was assuming that the AIs are copies of each other, which would eliminate most of these cases. (The remaining cases would be when the two AIs somehow diverge during the debate. I totally don’t see how this would happen, but that isn’t a particularly strong argument.)
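To make the chocolate/poison example concrete, here is a minimal sketch (the numbers are illustrative assumptions, not from the source) showing how two agents with exactly opposite utility functions can still endorse the same action when their probability estimates differ:

```python
# Two agents with exactly negated utilities can still agree on an action
# when they assign different probabilities to the relevant event.
# All utilities and probabilities below are illustrative assumptions.

def expected_utility(p_chocolate, u_chocolate, u_poison):
    """Expected utility of the action 'I eat the item',
    given a belief p_chocolate that the item is chocolate."""
    return p_chocolate * u_chocolate + (1 - p_chocolate) * u_poison

# Agent A (me): thinks the item is almost surely chocolate,
# and gains utility from eating chocolate, loses a lot from eating poison.
eu_a = expected_utility(p_chocolate=0.95, u_chocolate=+1, u_poison=-10)

# Agent B (hates me): utilities are exactly negated,
# but B thinks the item is almost surely poison.
eu_b = expected_utility(p_chocolate=0.05, u_chocolate=-1, u_poison=+10)

# Despite opposite utilities, both expected utilities are positive,
# so both agents endorse the same action: me eating the item.
print(eu_a > 0, eu_b > 0)  # both True
```

The disagreement about probabilities, not any overlap in goals, is what produces the unwanted coordination; with identical beliefs (e.g. copies of the same AI), negated utilities would make the agents' preferences over actions exactly opposed.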
Also, the debaters had better be comparably smart.
Yes, this seems like a necessary assumption in a symmetric debate.
Once again, this is trivially satisfied if the debaters are copies of each other.
It is interesting to note that this assumption might not be sufficient: even if the debate has symmetric rules, the structure of claims might not be symmetric. (That is, some false claims may be easier to argue for than against, and some attempted human-hacks may be easier to pull off than to prevent.)