Project idea: an iterated prisoner's dilemma competition/game

Epistemic effort: mostly just thinking out loud. I spent a few dozen minutes thinking about this myself, and then decided to write it up.

After watching this video by Veritasium about game theory, I am wondering whether more people having an understanding of game theory (iterated prisoner's dilemmas in particular) would lead to more prosocial behavior in the real world.

Here’s an example of where I’m coming from. The video talks about a failure mode where retaliatory strategies like tit-for-tat stumble into a terrible situation where they endlessly defect against one another. Both are “nice” in the sense that they don’t intend to defect first, but in the real world there are errors. You intend to cooperate, but maybe you screw up and accidentally defect instead. Or maybe the other person misjudges your cooperation as defection. Then they defect in retaliation. Then you defect in retaliation. So on and so forth.

You need a way out of this vicious cycle. One approach: instead of retaliating after every defection, you only retaliate 9 out of 10 times. (Something close to this is known in the literature as generous tit-for-tat.)
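To make the difference concrete, here's a minimal simulation sketch in Python. The payoff numbers (3 each for mutual cooperation, 1 each for mutual defection, 5 for the lone defector, 0 for the lone cooperator) are the conventional ones, and the 5% error rate is just a number I picked for illustration:

```python
import random

# Conventional prisoner's dilemma payoffs (my illustrative numbers, not from
# the video): mutual cooperation pays 3 each, mutual defection 1 each, a lone
# defector gets 5, and a lone cooperator gets 0.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def noisy_tft_match(forgiveness, noise=0.05, rounds=200, seed=0):
    """Two tit-for-tat players. With probability `forgiveness`, a player
    cooperates even though the opponent defected last round. With
    probability `noise`, an intended move flips by accident."""
    rng = random.Random(seed)
    prev = ("C", "C")  # each player's last *actual* move
    scores = [0, 0]
    for _ in range(rounds):
        moves = []
        for me in (0, 1):
            intended = ("C" if prev[1 - me] == "C" or rng.random() < forgiveness
                        else "D")
            flipped = "D" if intended == "C" else "C"
            moves.append(flipped if rng.random() < noise else intended)
        gained = PAYOFF[(moves[0], moves[1])]
        scores = [s + g for s, g in zip(scores, gained)]
        prev = tuple(moves)
    return scores

for f in (0.0, 0.1):  # grudge-holder vs. "retaliate only 9 out of 10 times"
    print(f"forgiveness={f}: scores over 200 rounds: {noisy_tft_match(f)}")
```

With forgiveness at 0, a single accidental defection can echo back and forth for the rest of the match; with forgiveness at 0.1, the pair tends to recover and get back to cooperating.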

I think this sort of thing happens in the real world. For example, maybe your partner does something that makes you unhappy. You feel wronged. So you retaliate by being cold towards them. After all, you don’t want to be a pushover and let such behavior go unpunished.

But your partner never intended to upset you. They don’t realize they did anything wrong. But they notice you being cold towards them. And they don’t want to let that go unpunished. So they retaliate by being cold towards you. Which you notice, and retaliate against by extending the duration of your coldness. So on and so forth. This sort of thing can lead to bitterness that lasts for an embarrassingly long time. It can happen not just between romantic partners, but between friends, family, co-workers, business partners, acquaintances, and more.

Now imagine that you’ve studied game theory. You understand the downsides of endlessly retaliating. Of sliding the “forgiveness” slider all the way to the left until it hits 0. You understand that this strategy is a poor one. It’s obvious to you. You’ve played in iterated prisoner’s dilemma competitions and explored numerous simulations, and it’s just very clear how dumb it is to slide that slider all the way to the left. It seems plausible to me—no, likely—that for such a person, this understanding would translate to real life and lead to more prosocial behavior and better outcomes.

I haven’t thought about it too deeply, but here’s my first approximation of how the game would work:

  • Rather than getting into the weeds of describing your strategy precisely via code or something, you, as a user, are presented with various sliders that you can set. (One hypothetical mapping from sliders to behavior is sketched in the code after this list.)

    • Niceness: do you defect first, or only ever in response? How aggressively do you want to try to exploit cooperators by defecting against them and seeing what happens?

    • Retaliation: how aggressively do you want to punish people who defect against you? Tit for tat? Tit for every two tats? Two tits per tat?[1]

    • Forgiveness: do you hold a grudge? If the other player defects, how long are you going to hold that against them?

    • Clarity: tit-for-tat is a very simple strategy, so it’s easy for the other players to get a sense of what you’re doing. On the other hand, you can devise complicated strategies that are more opaque, leaving the other players confused, unable to see a pattern in when you’ll cooperate and when you’ll defect.

  • Tangentially, you can run simulations and see how your strategy performs against “preset” strategies like tit-for-tat, always cooperate, always defect, etc. (a bare-bones runner for this is sketched after this list). And you will be presented with explainer text that adds commentary on your strategy: how it performs against other strategies, what is good, what is bad, what real-world situations it mirrors, etc.

  • But the main thing is that you play your strategy against everyone else (humans) who is playing the game. Maybe there is a competition once a week. And you see how you do. And there is a leaderboard.

    • Network effects are a potential roadblock here. It’s fun to play against 10,000 other players, but how do you get to 10,000 other players? There’s a chicken-and-egg problem.

    • Maybe this could be addressed with an assurance contract: players commit to participating only once enough other players have also committed?

    • Maybe it’s sufficiently fun to play even against just a few dozen other people? Especially for early adopters. Maybe that’s enough to kinda bootstrap your way out of the chicken-and-egg problem?

  • Along the way, like with the simulations in the second bullet point, you are presented with commentary. Like, “I noticed you slid forgiveness to 0. This can lead to the following vicious cycle. Here are some examples throughout history where this occurred, and here are some common personal situations where this occurs.”

  • In a perfect world the UI would be similar to Nicky Case’s The Evolution of Trust. In practice that’s probably way too much work, especially for a version 1. Even for a version 5. A simple UI is probably fine.

  • It’s possible that a more social angle would be appealing. Like instead of playing your strategy against all of the other people playing the game, you just play against a small group of friends. Or maybe you can do both: play against everyone, and play against a small group of friends. I feel like it’s more interesting to see how you do against a large group of people. Personally I wouldn’t be too interested in the small group of friends part, but I’m not really sure how others would feel.

  • It’d probably be a good idea to work with, or at least consult, people who are familiar with game theory. I generally think it’s important to talk to domain experts in these sorts of situations, both to avoid the scenario where important issues are looming in your blind spots and to make sure that what you’re doing is directionally correct. And I suspect that with a little bit of hustle, you can at least find some PhD students or hobbyist game theorists who can provide guidance.

  • I don’t see a path towards monetizing this.

    • Will people pay for this? Mayyyybe a small niche of people will pay, I dunno, $10/month. Somewhere around that order of magnitude. I don’t see $100/month.

    • $10k/month is a popular amount of money to target. At $10/month, you’d need 1,000 users to hit $10k/month.

    • $10/month is probably too low a figure to justify paid ads for customer acquisition. Definitely too low for cold outreach or sales. So you’d need some sort of passive, organic acquisition strategy. Social media or blogging doesn’t seem right. I think it’d have to be word of mouth. And maybe that could work: I’m imagining you find a niche of True Fans who loooove game theory and have friends who love game theory, and word spreads naturally. I’m skeptical though.

    • To monetize via ads, I think you’d need on the order of tens of millions of users, and this is just too niche for that.

    • You could try to be creative and monetize a different way. Maybe you recognize that the people who play your game tend to be extremely smart, and you monetize the same way job boards do, by serving as a recruiter of sorts for companies. Seems like too hard of a sell though. I’m imagining reaching out to companies, trying to convince them to pay for this sort of thing, and it just being too weird and unconventional for them. And in general, I have a pretty strong heuristic against businesses that require you to get “creative” about monetizing.
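Returning to the mechanics: to make the slider idea concrete, here's one possible mapping from slider values to a per-round decision rule. The slider names come from the list above, but `make_strategy` and the specific formulas are entirely my own hypothetical design:

```python
import random

def make_strategy(niceness, retaliation, forgiveness, clarity, seed=0):
    """Map four slider values in [0, 1] to a per-round decision rule.
    The formulas are my own guesses at how this might work, not a
    standard construction:
      niceness:    probability of NOT probing a cooperator with a defection
      retaliation: probability of starting to punish right after a defection
      forgiveness: per-round probability that a held grudge gets dropped
      clarity:     how readable you are; below 1.0 you sometimes play a
                   random move so opponents can't see your pattern
    Returns a function from the opponent's last move ("C", "D", or None
    on the first round) to your move."""
    rng = random.Random(seed)
    state = {"grudge": False}

    def move(opponent_last):
        if opponent_last == "D" and rng.random() < retaliation:
            state["grudge"] = True   # start punishing
        elif state["grudge"] and rng.random() < forgiveness:
            state["grudge"] = False  # let it go
        if state["grudge"]:
            intended = "D"
        else:
            # an exploratory defection against a cooperator, now and then
            intended = "C" if rng.random() < niceness else "D"
        if rng.random() > clarity:
            intended = rng.choice(["C", "D"])  # deliberate opacity
        return intended

    return move

# Tit-for-tat-ish, with the "retaliate only 9 out of 10 times" tweak:
gentle_tft = make_strategy(niceness=1.0, retaliation=0.9,
                           forgiveness=1.0, clarity=1.0)
```

The nice thing about a representation like this is that the commentary features ("I noticed you slid forgiveness to 0...") can key directly off the slider values rather than having to analyze arbitrary code.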
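And the simulations against preset strategies could start as a simple round-robin runner. Again a sketch: the presets are the standard ones named earlier, the payoffs are the same conventional numbers as before, and all names and signatures here are hypothetical:

```python
import itertools

# Same conventional payoffs as in the earlier sketch.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

# Preset strategies: each maps the opponent's last move (None on the
# first round) to "C" or "D".
PRESETS = {
    "tit-for-tat":      lambda last: "C" if last in (None, "C") else "D",
    "always cooperate": lambda last: "C",
    "always defect":    lambda last: "D",
}

def match(strat_a, strat_b, rounds=200):
    """Play two strategies head-to-head; return their total scores."""
    last_a = last_b = None
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = strat_a(last_b), strat_b(last_a)
        pay_a, pay_b = PAYOFF[(move_a, move_b)]
        score_a, score_b = score_a + pay_a, score_b + pay_b
        last_a, last_b = move_a, move_b
    return score_a, score_b

def tournament(strategies, rounds=200):
    """Round-robin: every strategy plays every other once."""
    totals = {name: 0 for name in strategies}
    for (name_a, a), (name_b, b) in itertools.combinations(strategies.items(), 2):
        score_a, score_b = match(a, b, rounds)
        totals[name_a] += score_a
        totals[name_b] += score_b
    return totals

print(tournament(PRESETS))
```

A real version would want noise, repeated matches with different seeds, and fresh strategy instances per match (the presets here are stateless, so reuse is safe), but this is the skeleton.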

I think the best case scenario for this project would be if it somehow raised the sanity waterline, brought us closer to a dath ilanian world, and affected important things like arms races in nuclear weapons and AI. I’m definitely not optimistic about this happening, but it does seem possible.

  1. ^

    There’s a good joke in here somewhere...
