Be Not Averse to Lost Purposes

tldr: Be not averse to lost purposes, as aversions yield ugh fields. Rather, confess them, and fix them joyously where possible. How to achieve this emotional state? I’m not sure, but the appropriate emotion might stand to lost purposes as curiosity stands to ignorance.

This got rambly; sorry. It was originally a reply to Those Who Aspire to Perfection, and I may have gotten carried away.


Learning to notice and avoid lost purposes, both in your own actions and in the incentive systems you shape for others, matters deeply. Lost purposes are everywhere, and addressing them is a promising way to improve your own effectiveness and the effectiveness of any incentive system you have the power to modify.

An aversion to lost purposes, though, is probably not a good way to practice rationality and remain sane, especially in our world. When lost purposes are rife, and you can see them everywhere, I would expect a strong negative reaction to be debilitating. Certainly, when I feel powerless to rectify a situation riddled with lost purposes, even my own mild aversion keeps me from thinking clearly about it. Strongly negative emotions are depressing, and depression is debilitating. So if you have a strong aversion to your own lost purposes, and you don’t immediately see ways to deal with them, you’re likely to become ineffective at handling them.

Worse, perhaps, is that a learned aversion to lost purposes is likely to make you stop noticing lost purposes except when they’re painfully obvious. This holds whenever noticing something reliably triggers a strong negative reaction, per the idea of Ugh Fields. (In short: if a thought is usually followed by a strong negative reaction, you will learn to stop having that thought. This learning is entirely subconscious, and it can be difficult to uncover.)

So, consider alternatives. What we want from an aversion to lost purposes is that it makes us notice them and feel driven to change them. What else could do the same thing?

One suggestive idea comes from an analogy to ignorance. Imagine having an aversion to ignorance: feeling bad and ashamed whenever you don’t know something. (I’m sure plenty of people have this, but it seems damaging.) An aversion to ignorance is bad for the same reasons an aversion to lost purposes is bad: it might lead you to rectify your ignorance occasionally, but mostly it leads you to stop noticing it. Humans, though, can also be equipped with curiosity. In certain circumstances, some people are willing to confess their ignorance, even at some cost, so that they can learn new ideas and fill the gap. Moreover, curiosity and its satisfaction carry almost entirely positive affect. Curiosity is accompanied by seeking, and sometimes playfulness, not by shame at ignorance.

Is there some analogous mental state we can cultivate, one that leads us to view lost purposes as something to be joyously repaired rather than shunned as shameful?

If this viewpoint seems oddly forced, I submit that the forcedness derives from where we lay the blame for ignorance and for lost purposes. For (non-willful) ignorance, knowledgeable people generally do not blame the ignorant, because ignorance is the default state. (If we shamed people for failing to know and understand things, we wouldn’t expect them to know much at all! Certainly, they wouldn’t seek to learn from us.)

On the other hand, we do intuitively blame people for their actions, and that includes blaming them for their plans. In fact, we blame organizations for their internal lost purposes. However, given the difficulty of designing complex social systems (and the fact that most social systems are more nearly evolved than designed), it’s unreasonable to expect an absence of lost purposes. Just as an experienced programmer expects to find bugs in software, we should expect to find lost purposes in our own complex plans and social structures.

Perhaps, then, part of the trick is to take a different approach to the way we intuitively assign blame and responsibility: those intuitions break down somewhat in situations whose consequences are hard to reason about. In such cases, we have to expect people to try things and tweak their approaches empirically, and we have to be willing to allow mistakes, so long as those mistakes aren’t willfully continued once the consequences become clear, in the same way that we cast no blame on ignorance but can get indignant about willful ignorance.