If you want to steer results, you have to keep an eye on your effects. You need closed-loop control. This is true at all levels of a system. In particular, once you have created a system, you have to pay attention to whether it is achieving its goal, or whether it needs to be redesigned. If you aren’t doing that, you are doing a lousy job steering. You don’t have as much optimization power as might appear at first glance. Or maybe you are doing that, but for some other goal.
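The open-loop versus closed-loop distinction can be sketched with a toy simulation (illustrative only; the target, drift, and gain values here are made up for the example):

```python
# Toy illustration: steering a quantity toward a target while the
# environment drifts. Open-loop control executes a fixed plan and never
# checks results; closed-loop control measures the error and corrects.

def run(steps, drift, correct):
    """Simulate `steps` rounds; `correct(value)` returns our adjustment,
    and `drift` is an unmodeled push from the environment each round."""
    value = 0.0
    for _ in range(steps):
        value += correct(value) + drift
    return value

TARGET = 10.0

# Open loop: a precomputed plan (move 1.0 per step for 10 steps) that
# would land exactly on the target if there were no drift.
open_loop = run(10, drift=-0.5, correct=lambda v: 1.0)

# Closed loop: each step, measure the remaining error and move a
# fraction of it (a simple proportional controller with gain 0.5).
closed_loop = run(10, drift=-0.5, correct=lambda v: 0.5 * (TARGET - v))

print(open_loop)    # ends far from the target: the plan ignored the drift
print(closed_loop)  # ends close to the target: feedback absorbed the drift
```

The closed-loop run still carries a small steady-state offset (a known limitation of purely proportional correction), but it lands far closer to the target than the open-loop plan, which silently accumulates the drift it never measures.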
I was really impressed by the Kevin Simler tweet (n) that everything that exists does so because of a positive feedback loop. It persists because it plays some role in renewing itself. You could call that positive feedback the purpose of the structure. Some things come into existence and then just wind down, failing to gain any traction. For those it is hard to isolate a purpose distinct from that of the creator. But things that persist have not just one creator, but myriad forces that renew them. The purpose of the system is not what it does, but what it does that contributes to its continuing existence. I think this meaningfully distinguishes partial successes from implicit goals. The bus system emitting CO2 does not produce buses next year. Successfully transporting people does. It also creates jobs. If those jobs form a coherent lobby, they may contribute to the existence of the system, in contrast to the CO2.
This approach (“the purpose of a system is the positive feedback loop that sustains it”) is a fascinating angle and feels like it has a lot of truth to it.
The weakness is that it’s easy to tell just-so stories about why some negative thing an organization does is necessary to sustain it and thus actually its purpose. Plus this feels disjoint from the conventional human meaning of the word “purpose,” which implies that it is something humans are doing intentionally.
One strength of “The purpose of a system is what it rewards” is that what a system rewards is often something that’s concretely available (provided you can get access to performance evaluation criteria) and something humans can be held accountable for and pressured into changing.
Or to put it another way, I think Simler’s definition is true and fascinating, but mine is probably more useful.