I interpret you as saying: normally the process by which we wind up with “a machine that runs an algorithm” is: first we have some algorithm (information-processing procedure) in mind, and then second we come up with a machine that captures it. Whereas that didn’t happen with brains.
But by the same token, normally the process by which we wind up with “a machine that captures images” is: first we have some idea about lenses and light-sensitive substrates etc., and then we build a camera. The human eye was not “designed” by evolution in that order. But that’s not important—we are still correct to think of a human eye as “designed” to capture images via lenses and light-sensitive substrates.
The design is always implicit, not explicit, in evolution (“design without a designer”), but it’s still real design.
By the same token, I claim that there are real design principles underlying what the brain does to allow navigation, anticipation of danger, learning new skills, etc., and those same design principles will be findable in some future algorithms textbook (and in some cases, current algorithms textbooks), and the brain is running an algorithm that works due to those principles, just as your eye works due to the principles of optics and lenses. The fact that evolution did not explicitly write out the principles in advance is kinda incidental.

(Sorry if I’m misunderstanding.)
There’s a reason I started out by calling it a nitpick. 😅
I’m not making a claim about the normal way we wind up with “a machine that runs an algorithm”, such that one can just swap in other things for “runs an algorithm”, so it was perhaps a mistake for me to justify it with “you commonly start with...”. My point is more that the hardware-software distinction generalizes to the case of mechanical adders, because there you start with a logic gate diagram, but not to the brain, because it evolved in a different way.
As an analogy, if one called an eye “a machine that bends light according to a ray optics diagram[1]”, that would be similarly misleading. The question is, I guess, whether “algorithm” means something more like “ray optics diagram” (“a set of instructions to be followed in calculations”) or whether it means something less premeditated.
[1] Not sure whether that’s the right term, or whether ray optics diagrams are necessarily used for designing cameras...
My full position is a bit subtle, because it’s quite hard to find a materialist-rationalist version of your statement in the OP that I would fully agree with. The word “design” is kinda objectionable because it implies a designer. Even “if one studied the brain well enough, one would come up with a model that could be used to substitute for the brain with equivalent behavior” is something I’m skeptical of. (But that skepticism is a bit separate from my objection above. Though both objections are motivated by a worry that one goes a bit too quickly from “supernaturalism is false” to “natural things are like artifice”.)
The best I can come up with, without coining wholly new words to describe it, is to just have a disclaimer, perhaps in the comments as I’m doing, pointing out that there’s still a distinction.
Calling it a nitpick because I don’t see any follow-up errors that this article would make as a result of this terminology.