Values Weren’t Complex, Once.

The central argument of this post is that human values are only complex because all of the obvious constraints and goals are now easily fulfilled. The resulting post-optimization world is deeply confusing, and leaves noise as the primary driver of human values. This has worrying implications for any kind of world-optimizing. (This isn’t a particularly new idea, but I am taking it a bit farther and/or in a different direction than this post by Scott Alexander, and I think it is worth making clear, given the previously noted connection to value alignment and effective altruism.)

First, it seems clear that formerly simple human values are now complex. “Help and protect relatives, babies, and friends” as a way to ensure group fitness and survival is mostly accomplished, so we instead find complex ethical dilemmas about the relative value of different behaviors. “Don’t hurt other people” as a tool for ensuring reciprocity has turned into compassion for humanity, animals, and perhaps other forms of suffering. These values are more complex than anything that could have been expressed in the ancestral environment, given its restricted resources. It’s worth looking at what changed, and how.

In the ancestral environment, humans had three basic desires: food, fighting, and fornication. Food is now relatively abundant, leading to people’s complex preferences about exactly which flavors they like most. These preferences differ because the base drive for food is optimizing past the point of satiation. Fighting was competition between people for resources, and since we all have plenty, it turns into status-seeking in ways that aren’t particularly meaningful outside of human social competition. The varieties of signalling and counter-signalling are the result. And fornication was originally for procreation, but we’re adaptation-executers, not fitness-maximizers, so we’ve short-circuited that with birth control and pornography, leading to an explosion of sexual variety-seeking and individual kinks.

Past the point where maximizing the function has a meaningful impact on the intended result, we see the tails come apart. The goal-seeking of human nature, however, still needs some direction in which to push the optimization process. The implication is that humanity finds diverging goals because we are past the point where the basic desires run out. As Randall Munroe points out in an xkcd comic, this leads to increasingly complex and divergent preferences for ever less meaningful results. And that comic would be funnier if the dynamic it describes weren’t a huge problem for aligning group decision making and avoiding longer-term problems.
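To make the claim that noise takes over once shared drives saturate a bit more concrete, here is a minimal toy simulation. This is my own illustrative sketch, not anything from the original argument or the linked posts: each simulated agent scores a set of options by a shared resource payoff that saturates at some cap, plus a small idiosyncratic taste term. Every number and name in it (the cap values, the noise scale, `mean_rank_correlation`) is an assumption chosen purely for illustration.

```python
# Toy model (illustrative only): agents score options by a shared resource
# payoff that saturates at `cap`, plus small idiosyncratic taste noise.
import numpy as np

rng = np.random.default_rng(0)

n_agents = 50
n_options = 200

# How much of a shared resource (e.g. calories) each option provides.
resource = rng.uniform(0, 10, size=n_options)

# Small personal tastes, different for every agent.
noise = 0.1 * rng.standard_normal(size=(n_agents, n_options))

def mean_rank_correlation(cap):
    """Average pairwise agreement (correlation of rank orderings) between
    agents' preferences when the shared drive saturates at `cap`."""
    utility = np.minimum(resource, cap) + noise      # shared payoff caps out
    ranks = utility.argsort(axis=1).argsort(axis=1)  # per-agent option ranks
    corr = np.corrcoef(ranks)                        # agent-by-agent agreement
    return corr[~np.eye(n_agents, dtype=bool)].mean()

# Lower caps mean the basic desire is easier to satisfy.
for cap in [10.0, 5.0, 1.0, 0.1]:
    print(f"cap={cap:5.1f}  mean agreement={mean_rank_correlation(cap):.3f}")
```

In this toy setup, when the cap is high relative to what the options provide, the shared payoff still discriminates between options and the agents’ rankings agree almost perfectly; once the drive saturates easily, the shared term is nearly constant, agreement collapses toward zero, and the idiosyncratic noise does all the work of determining preferences.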

If this is correct, the key takeaway is that as humans find ever fewer things to need, they inevitably find ever more things to disagree about. Even though we expect convergent goals related to dominating resources, which narrowly implies that increasing the pool of resources should reduce conflict, human values might instead diverge as the pool of such resources grows.