Yup. That does seem off, but it would make sense if we zoom out a level and assume that the money is paid to a third party, so it’s not really deadweight loss in the broader system context.
Right, but any such trash-car-for-net-win opportunity for Bob will make Alice less likely to make the deal: from her perspective, Bob taking such a win is equivalent to accident/carelessness. In the car case, I’d imagine this is a rare scenario relative to accident/carelessness; in the general case it may not be.
Perhaps a reasonable approach would be to split each bill evenly, with each party paying 50% and additionally burning an extra k% of the bill, where k is given by some increasing function of the total repair cost so far.
I think this gives better incentives overall: with an increasing function, it’s dangerous for Alice to hide problems, given that she doesn’t know whether Bob will be careful. It’s dangerous for Bob to be careless (or even to drive through swamps for rewards) when he doesn’t know whether there are other hidden problems.
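To make the mechanism concrete, here is a minimal sketch of the split-plus-burn scheme. The particular burn schedule (a capped rate linear in cumulative cost) and the reading that *each* party burns an extra k% are illustrative assumptions, not specified above:

```python
def settle_bills(bills, k=lambda total: min(0.5, total / 10_000)):
    """Settle repair bills under the 50/50-plus-burn scheme.

    Each bill is split evenly; on top of the 50% share, each party
    burns an extra k% of the bill, where k is an increasing function
    of the cumulative repair cost so far. The k used here (capped
    linear in cumulative cost) is just an illustrative assumption.
    """
    cumulative = 0.0
    alice_paid = bob_paid = burned = 0.0
    for bill in bills:
        cumulative += bill
        rate = k(cumulative)   # burn rate rises with total cost so far
        share = bill / 2
        alice_paid += share + bill * rate
        bob_paid += share + bill * rate
        burned += 2 * bill * rate
    return alice_paid, bob_paid, burned
```

With two $1000 bills and this schedule, the burn rate is 10% on the first bill and 20% on the second, so each party pays $1300 in total and $600 is burned: the second problem is costlier to everyone than the first, which is what makes hiding problems (or being careless) increasingly dangerous.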
I don’t think you can use the “Or donates to a third party they don’t especially like” version: if trust doesn’t exist, you can’t trust Alice/Bob to tell the truth about which third parties they don’t especially like. You do seem to need to burn the money (and to hope that Alice doesn’t enjoy watching piles of money burn).
Ok, I buy this argument.