Please correct me if I’m wrong, but it seems to me that you expanded the meaning of “intent” to include any kind of optimization, even things like “he has white skin because he intends to catch more vitamin D from the sun”, describing someone who has no idea what vitamin D is or where it comes from.
(You also mention robustness, but without specifying a timescale. If the atmosphere of the Earth changes, and as a consequence my distant descendants evolve a different skin color, does this still count as our “intent” to keep catching vitamin D?)
So it seems to me that you are redefining words, at the risk of sneaking in connotations. I mean, if you start using the word “intent” in this sense, it seems likely to me that an inattentive reader will understand it as, well, intent in the usual, narrower sense. And we already have enough problems with people misinterpreting the evolutionary-cognitive boundary, economists believing that everyone behaves perfectly rationally, etc.
I don’t think that’s right. As I mention in another comment, Dennett’s notion of the intentional stance is relevant here. Specifically, it gives us a way to distinguish between the cases that Zack intended to include in his concept of “algorithmic intent” and cases like the “catch more vitamin D” example you mention. To wit:
The positing of “algorithmic intent” is appropriate in precisely those cases where taking the intentional stance is appropriate (i.e., where—for humans—non-trivial gains in compression of description of a given agent’s behavior may be made by treating the agent’s behavior as intentional [i.e., directed toward some posited goal]), regardless of whether the agent’s conscious mind (if any!) is involved in any relevant decision loops.
Conversely, the positing of “algorithmic intent” is not appropriate in those cases where the design stance or the physical stance suffice (i.e., where no meaningful gains in compression of description of a given agent’s behavior may be made by treating that behavior as intentional).
Clearly, the “catch more vitamin D” case falls into the latter category, and therefore the term “algorithmic intent” could not apply to it.
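To make the “compression” criterion concrete, here is a minimal toy sketch in Python (my own illustration, with a made-up agent and arbitrary bit counts; nothing here is from Dennett or from Zack’s post). It compares the description length of an agent’s behavior under the physical stance (enumerate every state) with its description length under the intentional stance (posit a goal and derive the behavior from it):

```python
# Toy illustration (my own sketch, not from the original discussion):
# compare the description length of an agent's behavior under the
# "physical stance" (list every state visited) versus the
# "intentional stance" (posit a goal; derive the behavior from it).

def agent_trajectory(start, goal):
    """A simple goal-directed agent: each step moves one unit toward `goal`."""
    x, y = start
    path = []
    while (x, y) != goal:
        if x != goal[0]:
            x += 1 if goal[0] > x else -1
        else:
            y += 1 if goal[1] > y else -1
        path.append((x, y))
    return path

path = agent_trajectory((0, 0), (50, 30))

# Physical stance: record every position visited, coordinate by coordinate.
# (Assuming, arbitrarily, 8 bits per coordinate.)
physical_bits = len(path) * 2 * 8

# Intentional stance: record only the posited goal; the whole trajectory
# is then recoverable by re-running the "move toward the goal" rule.
intentional_bits = 2 * 8

print(f"physical stance:    {physical_bits} bits")     # 1280 bits
print(f"intentional stance: {intentional_bits} bits")  # 16 bits
```

When a posited goal lets you reproduce the behavior from a short rule, the intentional description is dramatically shorter, and taking the intentional stance pays. When no goal-directed rule beats the plain causal story (as in the vitamin D case), the intentional stance buys no compression, and positing “algorithmic intent” is unwarranted.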