Stigmergy and Pickering’s Mangle

Stigmergy is the notion that an agent’s behavior is sometimes best understood as being coordinated by the agent’s environment. Social insects, in particular, build nests with a recognizable standard pattern (a different pattern for each species). Does the wasp or termite have an idea of what the standard pattern is? Probably not. Instead, the computation inside the insect is a stateless stimulus/response rule set, and the partially-constructed nest catalyzes the next construction step.

An unintelligent “insect” clambering energetically around a convoluted “nest”, with the insect’s local perceptions driving its local modifications, is recognizably something like a Turing machine. The system as a whole can be more intelligent than either the (stateless) insect or the (passive) nest; the important computation is the interaction between the agent and the environment.
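To make the stateless-agent picture concrete, here is a minimal sketch (my own illustration, not a model of any real insect): the “insect” is a fixed stimulus-to-response lookup table with no memory of its own, the “nest” is a circular tape, and every bit of state in the system lives in the tape.

```python
# A toy stigmergic builder. The "insect" carries no memory at all: its whole
# behaviour is a fixed lookup from local stimulus to response, and the only
# state in the system lives in the shared environment (the tape).

TAPE_LEN = 40

# Stimulus: the (left, here, right) cells around the agent.
# Response: what to deposit in the current cell, and which way to step.
# The rules are invented purely for illustration: starting from an empty
# tape, they lay alternating bricks, then fill the gaps on the next pass.
RULES = {
    (0, 0, 0): (1, +1),
    (1, 0, 0): (0, +1),
    (0, 0, 1): (1, -1),
    (1, 0, 1): (1, +1),
    (0, 1, 0): (1, +1),
    (1, 1, 0): (1, +1),
    (0, 1, 1): (1, -1),
    (1, 1, 1): (1, +1),
}

def run(steps):
    tape = [0] * TAPE_LEN              # the "nest": all state lives here
    pos = TAPE_LEN // 2
    for _ in range(steps):
        stimulus = (tape[(pos - 1) % TAPE_LEN], tape[pos], tape[(pos + 1) % TAPE_LEN])
        deposit, move = RULES[stimulus]
        tape[pos] = deposit            # the agent modifies the environment...
        pos = (pos + move) % TAPE_LEN  # ...and that modification will shape
                                       # the stimulus it sees on later visits
    return tape

if __name__ == "__main__":
    for steps in (20, 40, 60, 120):
        print(f"{steps:4d} steps: " + "".join("#" if c else "." for c in run(steps)))
```

Notice that nothing in the rule table mentions the overall pattern: the agent’s first pass lays alternating bricks, and only because the half-built nest then presents different stimuli does the second pass fill in the gaps.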

Theraulaz and Bonabeau have simulated lattice swarms and obtained some surprisingly realistic wasp-nest-like constructions. Their paper is on CiteSeer, but this summary gives a quicker overview.

Humans modify the environment (e.g. by writing books and storing them in libraries), and human behavior is affected by that environment (e.g. by reading books). Wikipedia is an excellent example of what human-to-human stigmergic coordination looks like: instead of editors interacting directly with one another, each edit leaves a trace, and future edits respond to that trace (this impersonal interaction may avoid some biases toward face-saving and status-seeking).

Andrew Pickering is a sociologist who studies science and technology. He wrote a book called “The Mangle of Practice”, and he includes non-human “actors” in his sociology. For example, he would say that a bubble chamber acts on a human observer when the observer sees a tracery of bubbles after a particle-physics experiment. This makes his theory less society-centric and more recognizable to a non-sociologist.

As a programmer, the best way I can explain Pickering’s mangle is by reference to programming. In trying to accomplish something using a computer, you start with a goal, a desired “capture of machinic agency”. You interact with the computer, alternating between human-acts-on-computer (edit) phases and computer-acts-on-human (run) phases. In this process, the computer may display “resistances”, and, as a consequence, you might change your goals. Not all things are possible or feasible, and one way we discover impossibilities and infeasibilities is via these resistances. Pickering would say that your goals have been “mangled”. Symmetrically, the computer program gets mangled by your agency (mangled into existence, even).
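Purely as a toy (this is my own gloss on the analogy, not anything Pickering formalizes), that loop might be sketched like this: each round alternates a run phase and an edit phase, and a resistance from the run phase is answered either by accommodating it (editing the program) or by mangling the goal itself.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# A toy rendering of the edit/run loop described above. All names and the
# "resistance" condition are invented for illustration.

@dataclass
class Project:
    goal: str                                  # the desired "capture of machinic agency"
    program: List[str] = field(default_factory=list)
    history: List[str] = field(default_factory=list)

def run_phase(project: Project) -> Optional[str]:
    """Computer-acts-on-human: return a resistance, or None if the goal is met."""
    # Stand-in for actually running the program: here the only "resistance"
    # is that the program is still too small to do what the goal asks.
    return None if len(project.program) >= 3 else "the program falls short of the goal"

def edit_phase(project: Project, resistance: str) -> None:
    """Human-acts-on-computer: accommodate the resistance, or mangle the goal."""
    if len(project.program) < 2:
        project.program.append(f"# workaround for: {resistance}")
        project.history.append("accommodated: edited the program")
    else:
        project.goal += " (scaled back)"
        project.program.append("# declared good enough")
        project.history.append("mangled: revised the goal")

def mangle(project: Project, max_rounds: int = 10) -> Project:
    for _ in range(max_rounds):
        resistance = run_phase(project)
        if resistance is None:
            break
        edit_phase(project, resistance)
    return project

if __name__ == "__main__":
    p = mangle(Project(goal="parse every log format ever invented"))
    print(p.goal)                  # the goal itself has been mangled along the way
    print(*p.history, sep="\n")
```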

Pickering says that all of science and technology can be described by a network including both human and non-human actors, mangling each other over time, and in his book he has some carefully worked-out examples—Donald Glaser’s invention of the bubble chamber, Morpurgo’s experiments measuring an upper bound on the number of free quarks, Hamilton’s invention of quaternions, and a few more.

I hope you find these notions (stigmergy and the mangle) as provocative and intriguing as I do. The rest of this post consists of my own thoughts, far more speculative and probably not as valuable.

Around each individual is a shell of physical traces that they have made—books that they’ve chosen to keep nearby, mementos and art that they have collected, documents that they’ve written. At larger radii, those shells become sparser and intermingle more, but eventually those physical traces comprise a lot of what we call “civilization”. Should a person’s shell of traces be considered integral to their identity?

Most of the dramatic increases in our civilization’s power and knowledge over the past few thousand years have been improvements in these stigmergic traces. Does this suggest that active, deliberate stigmergy is an appropriate self-improvement technique for rationality and other desirable traits? Maybe exoself software would be a good human-rationality-improving project. I wrote a little seed called exomustard, but it doesn’t do much of anything.

Might it be possible for some form of life to exist within the interaction between humans and their environment? Perhaps the network of roads, cars, and car-driving could be viewed as a form of life. If all physical roads and cars were erased, humans would remember them and build them again. If all memory and experience of roads and cars were erased, humans would quickly rediscover how to use the physical objects. But if both were erased simultaneously, it seems entirely plausible that some other form of transportation would become dominant. Nations are another example. These entities self-catalyze, and they maintain their existence and some of their properties in the face of change; both are lifelike properties.

What would conflict between a human and a stigmergic, mangled-into-existence “capture of machinic agency” look like? At first glance, this notion seems like some quixotic quest to defeat the idea and existence of automobiles (or even windmills). However, the mangle already includes a notion of conflict, in the form of “resistances”. Some resistances, like the speed of light or the semi-conservation of entropy, we’re probably going to have to live with; those aren’t the ones we’re interested in. There are also accidental resistances, due to choices made earlier in the mangling process.

Bob Martin has a paper listing some symptoms of bad design—rigidity, fragility, immobility, viscosity. We might informally say “the system *wants* to do such-and-so”, where the want is often some sort of inertia, a tendency to continue on a previous path. These are examples of accidental resistances that humans chose to mangle into existence and later came to regret. Every time you find yourself saying “well, it’s not good, but it’s what we have and it would be too expensive/risky/impractical to change”, you are in conflict with a stigmergic pattern.

Paul Grignon has a video, “Money as Debt”, that describes a world in which we have gradually built, over centuries, an institution which is powerful and which (homeostatically) defends its own existence, but which also (due to its size, power, and accidentally-built-in drive) steers the world toward disaster. The video trips a lot of my conspiracy-theory sensors, but Paul Grignon’s specific claims are not necessary for the general principle to be sound: we can build institutions into our civilization that subsequently have powerful steering effects on that civilization—steering it onto a collision course with a wall, maybe.

In conclusion, stigmergy and Pickering’s mangle are interesting and provocative ideas and might be useful building blocks for techniques to increase human rationality and reduce existential risk.