Visions and Mirages: The Sunk Cost Dilemma

Summary

How should a rational agent handle the Sunk Cost Dilemma?

Introduction

You have a goal, and set out to achieve it. Step by step, iteration by iteration, you make steady progress towards completion—but never actually get any closer. You’re deliberately not engaging in the sunk cost fallacy—at no point does the perceived cost of completion get higher. But at each step, you discover another step you didn’t originally anticipate, and had no priors for anticipating.

You’re rational. You know you shouldn’t count sunk costs in the total cost of the project. But you’ve now invested twice as much effort as you would have originally, and have done everything you originally thought you’d need to do, yet you have just as much work ahead of you as when you started.

Worse, each additional step is novel; the five extra steps you discovered after completing step 6 gave you no basis for predicting the twelve extra steps you added after completing step 19. And after step 35, when you discovered yet another step, you updated your priors to account for your incorrect original estimate—and the project was still worth completing. Over and over. All you can conclude is that your original priors were unreliable. Yet no update to your priors changes the fact that the remaining cost always looks worth paying to complete the project.
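The shape of the trap can be made concrete with a toy simulation (all numbers here are hypothetical, chosen only for illustration): an agent follows the textbook sunk-cost-free rule, comparing only the remaining cost against the project's value, while each completed step reveals exactly as much new work as it finished.

```python
def simulate(value=100.0, remaining=50.0, cost_per_step=10.0,
             discovered_per_step=10.0, steps=20):
    """Run `steps` iterations of the sunk-cost-free decision rule:
    continue whenever the remaining cost is below the project's value."""
    spent = 0.0
    for _ in range(steps):
        if remaining >= value:
            break  # the rational stopping rule -- it never fires here
        spent += cost_per_step            # pay for the work you knew about...
        remaining -= cost_per_step
        remaining += discovered_per_step  # ...then discover just as much new work
    return spent, remaining

spent, remaining = simulate()
# After 20 steps the agent has spent 200 units -- four times the original
# estimate -- while the remaining estimate still sits at 50, still "worth it".
```

The remaining cost never increases, so the sunk cost fallacy is never committed; total effort spent nevertheless grows without bound.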

You are starting to feel like you are caught in a penny auction for your time.

When do you give up your original goal as a mirage? At what point do you give up entirely?

Solutions

The trivial option is to just keep going. Sometimes this is the only viable strategy: your goal is mandatory, and there are no alternative solutions to consider. There’s no guarantee you’ll finish in any finite amount of time, however.

One option is to precommit: set a specific level of effort you’re willing to invest before stopping progress, and possibly starting over from scratch if that’s an option. When bugfixing someone else’s code on a deadline, my personal policy is to set aside enough time at the end of the deadline to write the code from scratch and debug that (the code I write is not nearly as buggy as the code I’m usually handed). Precommitment of this sort can work in situations where there are alternative solutions or where the goal is disposable.
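As a sketch (all numbers hypothetical), a precommitment policy bounds your loss by capping total effort, no matter how attractive the remaining cost still looks at the moment the cap is hit:

```python
def simulate_with_budget(value=100.0, remaining=50.0, cost_per_step=10.0,
                         discovered_per_step=10.0, budget=100.0):
    """Continue while the remaining cost looks worth paying, but stop
    unconditionally once total effort reaches the precommitted budget."""
    spent = 0.0
    while remaining < value and spent < budget:
        spent += cost_per_step
        remaining -= cost_per_step        # finish the known work...
        remaining += discovered_per_step  # ...discover unanticipated work
    return spent

# The loss is bounded by the budget, however the project unfolds.
```

Whatever the adversarial structure of the project, this policy can never cost more than the budget plus one step.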

Another option is to discount sunk costs rather than ignore them entirely—count them, but at a reduced weight. Updating your priors is one way of doing this, but it isn’t guaranteed to navigate you through the dilemma.
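One way to sketch such a rule (the 25% discount rate is arbitrary, chosen purely for illustration): continue only if the project's value exceeds the remaining cost plus a discounted share of what has already been spent. A discount of 0 recovers the pure sunk-cost-free rule; a discount of 1 counts sunk costs in full.

```python
def should_continue(value, remaining, sunk, discount=0.25):
    """Partially count sunk costs in the continue/stop decision."""
    return value > remaining + discount * sunk

# With value=100 and remaining stuck at 50, the pure rule (discount=0)
# continues forever; this rule stops once sunk effort reaches 200.
assert should_continue(100.0, 50.0, sunk=0.0)        # fresh project: go
assert not should_continue(100.0, 50.0, sunk=200.0)  # penny auction: stop
```

Any positive discount rate terminates the penny auction eventually; the cost is that it will sometimes abandon projects that genuinely were worth finishing.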

Unfortunately, there isn’t a general solution. If there were, IT would be a very different industry.

Summary

The Sunk Cost Fallacy is best described as a frequently faulty heuristic. There are game-theoretic ways of extracting value from agents who follow a strict policy of never counting sunk costs, and these happen all the time in IT—frequent requirement changes to fixed-cost projects are a good example (one which can cut both ways, depending on how the contract and requirements are structured). It is best to always have an exit policy prepared.

Related Less Wrong Post Links

http://lesswrong.com/lw/at/sunk_cost_fallacy/ - A description of the Sunk Cost Fallacy

http://lesswrong.com/lw/9si/is_sunk_cost_fallacy_a_fallacy/ - Arguments that the Sunk Cost Fallacy may be misrepresented

http://lesswrong.com/lw/9jy/sunk_costs_fallacy_fallacy/ - The Sunk Cost Fallacy can be easily used to rationalize giving up

ETA: Post Mortem

Since somebody has figured out the game now, an explanation: everybody who spent time writing a comment insisting you -could- get the calculations correct, and that the imaginary calculations were simply wrong? I mugged you. The trap was in doing the calculations -instead of- trying to figure out what was actually going on. You forgot there was another agent in the system with objectives different from your own. Here, I mugged you for a few seconds or maybe minutes of your time; in real life, it would be hours, weeks, months, or your money, as you keep assuming the mistake is your own.

Maybe it’s a buggy open-source library with a bug-free proprietary version you can pay for: it gets you in the door, then charges you money once it’s more expensive to back out than to continue. Maybe it’s somebody who silently and continually moves work to your side of the fence on a collaborative project, once it’s more expensive to back out than to continue. Not counting all your costs opens you up to exploitative behaviors that add costs at the back end.

In this case I was able to mug you in part because you didn’t like the hypothetical, and fought it. Fighting the hypothetical always reveals something about yourself; in this case, it revealed that you were exploitable.

In real life I’d be able to mug you because you’d assume someone had fallen prey to the Planning Fallacy, as you assumed must have happened in the hypothetical. In the hypothetical, an evil god—me—was deliberately manipulating events so that the project would never be completed (notice what role the -author- of that hypothetical played in it, and what role -you- played). In real life, you don’t need evil gods—just other people who see you as an exploitable resource, and who will keep mugging you until you catch on to what they’re doing.