One obvious reason this might not work as well for you as for Islamic merchants is the scale of the threatened sanction.
Even assuming that the merchant would not face legal penalties for reneging on his note, and even assuming a lack of extrajudicial violence against him, he is still potentially subject to extreme levels of social pressure not available to you today. You cannot prevent a grocery store from selling him food, or a job from employing him. Social networks in a smaller society could potentially do that. You can...threaten to say mean things about him on the internet?
I’m glad this approach has worked for you, but I don’t think it’s game-theoretically sound, I think it is relying on good faith.
As an aside, I’m somewhat skeptical of the story of the merchants’ notes being traded so far afield. How do you avoid forgery? If a note is given and never traded, the merchant can remember to whom a favor is owed—and if you stay in the same area, people can see who is owed and keep him honest. But how can a trader in Zanzibar know whether this is a genuine favor note from a trader in Algiers?
Good points.
Can I double click on: “I don’t think it’s game-theoretically sound”?
And I don’t know enough about the Islamic merchants to answer that question. I guess the notes were countersigned along the way and so on. But true, it does sound a bit outlandish to modern ears (which is why I thought it’d make a cute opening).
Attempted expansion:
If the person you are dealing with is willing to do the right thing even if it costs them something, a lot of different systems will work.
If you do not trust the person you are dealing with to do the right thing in the absence of incentives, you want a system that will impose incentives that ensure it is in their interest to do the right thing. I’m calling this ‘game-theoretically sound’ to mean ‘the system accomplishes the intended goal even when one or both parties are the kinds of sociopath that prevail in game theory problems’.
This could be formal law (do your end of the contract or the government will punish you).
It could be informal threats (uphold the bargain or me and my mates will clobber you).
In some circumstances it could be reputation (honor your note or no one will deal with you again? refund my purchase or I will review you badly online?).
However, for reputation to actually work as a system that is protected against adversaries/sociopaths (as distinct from working as a system for nice people who are already pretty much trustworthy), you need the damages caused by reputation to exceed the benefits gained by defecting. This is plausibly true for ancient merchants living in small societies, or for companies that have large numbers of customers. I don’t think it’s true for your networks of trust.
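That condition can be sketched as a toy calculation. This is purely illustrative (not from the thread above): it assumes a repeated interaction with geometric discounting, where defecting forfeits all future trade, and every name and number in it is made up.

```python
# Toy check of the condition "reputation damage must exceed the gain
# from defecting". Assumes the defector loses all future cooperation
# and values the next round at `discount` times the current one.

def reputation_deters(defect_gain, per_round_value, discount):
    """Return True if losing future cooperation outweighs a one-time defection.

    defect_gain:     one-shot payoff from cheating (e.g. keeping the goods)
    per_round_value: payoff per round of continued honest trade
    discount:        value of the next round relative to this one (0 < d < 1)
    """
    # Value of the future relationship: v*(d + d^2 + ...) = v*d/(1-d)
    future_value = per_round_value * discount / (1 - discount)
    return future_value > defect_gain

# A merchant expecting many more profitable trades is kept honest:
print(reputation_deters(defect_gain=5, per_round_value=1, discount=0.9))   # True (future worth 9)
# ...but a large enough one-shot prize beats any reputation:
print(reputation_deters(defect_gain=50, per_round_value=1, discount=0.9))  # False
```

The second call is the scaling worry in miniature: in a small network, the stakes of any single deal can easily dwarf the value of staying in it.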
That is not necessarily a problem for you! If you’re just dealing with your neighbor, you don’t need your system to be defensible against sociopaths. But if you want to scale up your system, it will become more and more relevant.
Hope this is clearer, apologies for length.
I appreciated the length!
It’s true, building a network with high barriers to entry is hard to scale, and will never compare to the scalability of a functioning justice system or something like a blockchain. And it relies a lot on weeding out sociopaths to function; though the value of belonging to a high-trust network can work as an incentive to play fair and not get excluded.
Like, if that note you signed gets cashed in after your death, who pays it?
Also, what if someone forges your signature and makes fake notes?
Those are the questions I have about the written promises.