[Question] What’s the best way to streamline two-party sale negotiations between real humans?

Some types of negotiations are strategyproof: they're designed so that the optimal strategy for each player is to be truthful. For example, in a Vickrey auction there's no incentive to lie or bid anything other than your true maximum; shading your bid (or overbidding) can only put you at a disadvantage.
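As a sanity check on that claim, here's a small brute-force sketch (toy values, not a proof) confirming that in a second-price auction, no alternative bid ever beats bidding your true value:

```python
# Brute-force check that truthful bidding is a dominant strategy in a
# second-price (Vickrey) auction. Values and bid grids are illustrative.

def vickrey_utility(my_bid, my_value, other_bids):
    """Bidder's utility: value minus second-highest bid if they win, else 0."""
    if my_bid > max(other_bids):
        return my_value - max(other_bids)
    return 0.0  # ties broken against us, for simplicity

def truthful_is_dominant(my_value, bid_grid, other_bid_profiles):
    """Truthful utility >= utility of every alternative bid, in every profile."""
    for others in other_bid_profiles:
        honest = vickrey_utility(my_value, my_value, others)
        for b in bid_grid:
            if vickrey_utility(b, my_value, others) > honest:
                return False
    return True

grid = [x / 2 for x in range(0, 21)]             # candidate bids 0.0 .. 10.0
profiles = [(a, b) for a in grid for b in grid]  # all pairs of rival bids
print(truthful_is_dominant(7.0, grid, profiles)) # True: no deviation helps
```

Shading only risks losing an auction you'd have profited from; overbidding only risks winning at a price above your value.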

Unfortunately, when it comes to negotiations between a single buyer and a single seller, it's been proven (the Myerson–Satterthwaite theorem) that there is no strategyproof solution. (See Lying in negotiations: a maximally bad problem.) The seller is always incentivized to overrepresent the value of the item, and the buyer to underrepresent it. This can lead to brinkmanship, where both parties try to set a firm "take it or leave it" price in order to force the other party to accept, at the risk of no deal occurring at all.
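To make the misreporting incentive concrete, consider a toy "split the difference" mechanism: trade happens iff the buyer's report is at least the seller's, at the midpoint price. The numbers below are made up, but they show a seller profiting from overreporting their cost:

```python
# Toy split-the-difference mechanism. Illustrative numbers only;
# this is exactly the kind of mechanism that is NOT strategyproof.

def outcome(buyer_report, seller_report):
    """Return the trade price, or None if no trade occurs."""
    if buyer_report >= seller_report:
        return (buyer_report + seller_report) / 2
    return None

buyer_value, seller_cost = 10.0, 4.0

# Honest reports: trade at 7.0, seller surplus 3.0
honest_surplus = outcome(buyer_value, seller_cost) - seller_cost

# Seller overreports their cost as 8.0: trade at 9.0, surplus 5.0
shaded_surplus = outcome(buyer_value, 8.0) - seller_cost

print(honest_surplus, shaded_surplus)  # 3.0 5.0 -- lying pays
```

The catch, and the source of brinkmanship, is that overreporting too far (say, above the buyer's true value) kills the deal entirely, leaving both parties with nothing.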

Ideally, the deal should maximize total utility across both players: trade should happen whenever the buyer values the item more than the seller does. But when it comes to real humans in the real world, it's very easy for either party to lie about their own utility curve, so there's no good way for both parties to enforce this.
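One way to see why the price fight is so hard to defuse: with quasilinear utilities (a standard simplifying assumption, not something the mechanism can enforce), the total surplus from a trade is fixed by the two valuations, and the price only divides it, so every dollar one side gains is a dollar the other loses:

```python
# Total surplus doesn't depend on price: buyer gets (value - price),
# seller gets (price - cost), and the sum is always (value - cost).
# Toy numbers for illustration.
buyer_value, seller_cost = 10.0, 4.0
for price in (5.0, 7.0, 9.0):
    buyer_surplus = buyer_value - price
    seller_surplus = price - seller_cost
    print(price, buyer_surplus, seller_surplus, buyer_surplus + seller_surplus)
# The sum is 6.0 every time; the price is purely zero-sum between the parties.
```

Whether trade happens at all determines the size of the pie; the negotiated price is pure redistribution, which is exactly why each party is tempted to lie about their curve.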

The typical way humans conduct these negotiations is with emotional manipulation, extortion, artificial self-restrictions, social coercion, etc. (e.g. think of the stereotypical car salesman). This seems generally bad for epistemics, and as someone who has to negotiate a lot, I also find it personally very annoying.

I’d like to design a system that allows these negotiations to take place in a more incentive-compatible way: one that’s faster to execute, doesn’t reward skill at manipulating other people, and is less likely to leave bad feelings afterwards. Obviously it can’t be perfect, and there will be some way of gaming the system. But humans aren’t superintelligences: if the system makes the optimal strategy hard to calculate, and makes reporting one’s true valuation cost only a small disadvantage, I expect that most people will comfortably just fall back on being truthful.

How would you design such a system?

(I have an idea that I’ll share later, but I don’t want to prime people with a specific kind of approach from the beginning.)