Is this actually a failure mode though, if you only “compromise” with people you respect intellectually? In retrospect, this sounds kind of like an approximation to Aumann agreement.
Each side should update on the other’s arguments and data, and on the fact that the other side believes what it does (insofar as we can’t perfectly trust our own reasoning process). This often means they update towards the other’s position. But it certainly doesn’t mean they’re going to update so much as to agree on a common position.
You don’t need to try to approximate Aumann agreement, because you don’t believe that either you or the other party is perfectly rational, so you can’t give your own or the other’s beliefs that kind of weight.
Also, people who start out looking for a compromise might be led to compromise in a bad way: A’s theory predicts the ball will fall down, B’s theory predicts the ball will fall up, and the compromise theory predicts it will stay in place, even though both A and B have evidence against that.
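A toy numeric sketch of that last point (all numbers made up for illustration): splitting the difference between the two sides' *predictions* lands you on an outcome that *pooling their beliefs* would assign almost no probability to.

```python
# Outcomes encoded as positions: ball falls down (-1), stays (0), falls up (+1).
outcomes = {"down": -1.0, "stay": 0.0, "up": 1.0}

# Hypothetical belief distributions for A and B over the outcomes.
p_a = {"down": 0.90, "stay": 0.05, "up": 0.05}
p_b = {"down": 0.05, "stay": 0.05, "up": 0.90}

# "Split the difference" on the point estimates:
est_a = sum(outcomes[o] * p_a[o] for o in outcomes)   # expectation under A: -0.85
est_b = sum(outcomes[o] * p_b[o] for o in outcomes)   # expectation under B: +0.85
midpoint = (est_a + est_b) / 2                        # 0.0 -> "ball stays put"

# Pooling the distributions instead (simple linear opinion pool):
pooled = {o: (p_a[o] + p_b[o]) / 2 for o in outcomes}
# pooled["stay"] stays at 0.05: both sides agree "stay" is unlikely,
# so the midpoint "compromise" is exactly the outcome the pooled
# belief gives almost no probability mass to.
```

The bug in the bad compromise is compromising in outcome-space rather than in belief-space: both parties think "stay in place" is nearly impossible, yet it is the midpoint between their predictions.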