Years after I first thought of it, I continue to think that this chain reaction is the core of what it means for something to be an agent, AND why agency is such a big deal, the sort of thing we should expect to arise and outcompete non-agents. Here’s a diagram:
Roughly, plans are necessary for generalizing to new situations, for being competitive in contests for which there hasn’t been time for natural selection to do lots of optimization of policies. But plans are only as good as the knowledge they are based on. And knowledge doesn’t come a priori; it needs to be learned from data. And, crucially, data varies enormously in quality: most of it is irrelevant or unimportant. High-quality data, the kind that gives you useful knowledge, is hard to come by. Indeed, you may need to make a plan for how to get it. (Or more generally, being better at making plans makes you better at getting higher-quality data, which makes you more knowledgeable, which makes your plans better.)
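The chain reaction described above can be sketched as a toy simulation. This is a hypothetical illustration, not anything from the original discussion: it assumes each quantity is a single scalar "skill level" and that each step of the cycle is a simple multiplicative improvement, just to show how the loop compounds on itself.

```python
# Toy model of the plans -> data -> knowledge -> plans chain reaction.
# Hypothetical illustration: all quantities are scalar skill levels,
# and `gain` is an assumed per-step improvement factor.

def run_loop(steps: int, gain: float = 0.1) -> list[float]:
    """Simulate the cycle: better plans get better data, better data
    yields more knowledge, more knowledge yields better plans."""
    plan_quality = 1.0
    knowledge = 1.0
    history = []
    for _ in range(steps):
        # Better plans let you seek out higher-quality data...
        data_quality = plan_quality * (1 + gain)
        # ...which adds to your knowledge...
        knowledge = knowledge + gain * data_quality
        # ...which in turn improves your plans.
        plan_quality = knowledge * (1 + gain)
        history.append(plan_quality)
    return history

trajectory = run_loop(10)
# Each pass through the cycle amplifies the next: plan quality
# strictly increases, and the increments themselves grow.
assert all(b > a for a, b in zip(trajectory, trajectory[1:]))
```

The point of the sketch is only that the feedback is positive: each quantity feeds the next, so the whole cycle accelerates rather than settling to a fixed point the way a thermostat does.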
I think it’s probably even simpler than that: feedback loops are the minimum viable agent, i.e. a thermostat is the simplest kind of agent possible. Stuff like knowledge and planning are elaborations on the simple theme of the negative feedback circuit.
I disagree; I think we go astray by counting things like thermostats as agents. I’m proposing that this particular feedback loop I diagrammed is really important, a much more interesting phenomenon to study than the more general category of feedback loop that includes thermostats.
Seems similar to the OODA loop.
Yep! I prefer my terminology, but it’s basically the same concept, I think.