A control system doesn’t model a system; to a large degree, it is the designer’s model of the system it controls.
This assertion suffers from the same problems of shuffling detached handles. You need to be more technical in expressing things like this, otherwise a curious reader is left with no choice other than to invent definitions that can make your assertion either true or false.
A model is a simplified, abstracted representation of an object or system that presents only the information needed by its user. For example: the plastic models of aircraft I built as a kid abstract away everything except the external appearance; a mathematical model of a system shows only those dimensions and relationships useful to the model’s users; and a control system is a model of the relationships between the stimuli and the responses desired by the designer and user of the larger system being controlled (with evolution as designer and organism as user, in the biological analogy).
If I understand this definition correctly, then temperature, as used by a thermostat, is a model of a system: It abstracts away all the details about the energy of individual particles in the system, except for a single scalar value representing the average of all those energies.
Yes, all of the measurements I can think of off-hand are basically one-dimensional models of a system / object.
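The thermostat case is concrete enough to sketch. Below is a minimal bang-bang controller (all names and numbers are hypothetical, not from any real device) whose entire model of the room is a single scalar, the measured temperature:

```python
# A bang-bang thermostat: its only "model" of the room is one scalar.
def thermostat_step(temperature, setpoint, hysteresis=0.5):
    """Return True (heater on), False (heater off), or None (no change)."""
    if temperature < setpoint - hysteresis:
        return True
    if temperature > setpoint + hysteresis:
        return False
    return None  # inside the dead band: leave the heater as it is

# Simulate a room that leaks heat to a 5-degree exterior.
temp, heater = 15.0, False
for _ in range(200):
    decision = thermostat_step(temp, setpoint=20.0)
    if decision is not None:
        heater = decision
    temp += (1.0 if heater else 0.0) - 0.05 * (temp - 5.0)  # heating minus leakage
```

Despite knowing nothing about individual particle energies, the controller holds the room near the setpoint, which is the sense in which the scalar is a sufficient model for its user.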
I was going to post something in a similar vein, but then I remembered that one of RichardKennaway’s points was about thinking about living beings as control systems. Evolved control systems don’t have designers per se. Whether they do interesting things like account for disturbances or have internal models of the external world depends on the strength of the selection pressure that generated the system.
Evolved systems, though they are not designed, represent an accumulation of evidence of what did and did not work. The process of evolution is a very poor approximation of rationality, and as would be expected, it takes far more evidence for it to produce results than it would take for an ideal Bayesian intelligence, or even an imperfect human designer.
What is your evidence for this assertion?
In my analysis, evolution by sexual reproduction can be quite good at rationality, collecting about 1 bit of information per generation per individual, because an individual can be naturally selected, or die, only once.
The factors limiting the learning speed of evolution are the high cost of this information, namely death, and the fact that this is the only kind of data going into the system. And the value being optimized is avoidance of death, which also avoids data gathering. And this optimization function is almost impossible to change.
If the genes were ideal Bayesian intelligences, they would still be limited by this high cost of data gathering. It would be something like this:
Consider yourself in a room. On the wall there is a lot of random data. You can do whatever you want with it, but whatever you do, it will change your chance of dying or being copied with no memory. The problem is that you do not know when you die or are copied. Your task is to decrease your chance of dying. This is tractable mathematically, but I find it somewhat tricky.
Kim Øyhus
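The "about 1 bit per generation per individual" cap in the comment above can be checked directly: a single survive-or-die outcome is a binary random variable, and the Shannon entropy of a binary variable never exceeds one bit. A minimal check (the function name is mine, for illustration):

```python
import math

def survival_entropy_bits(p_survive):
    """Shannon entropy (in bits) of a single live-or-die outcome."""
    if p_survive in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    q = 1.0 - p_survive
    return -(p_survive * math.log2(p_survive) + q * math.log2(q))

# Entropy peaks at exactly 1 bit when survival is a coin flip (p = 0.5)
# and falls toward 0 as survival becomes nearly certain or nearly impossible,
# so one selection event conveys at most one bit per individual.
assert all(survival_entropy_bits(p) <= 1.0 for p in (0.1, 0.3, 0.5, 0.7, 0.9))
```

Note this is an upper bound: when most individuals survive (or most die), each generation yields far less than a full bit per individual.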
Well, perhaps you could reduce the effectiveness of even a Bayesian superintelligence to the level of evolution by restricting the evidence it observes to the evidence that evolution actually uses. But that is not the point.
Evolution ignores a lot of evidence, for example, it does not notice that a gene that confers a small advantage is slowly increasing in frequency and that it would save a lot of time to just give every member of the evolving population a copy of that gene. When a mutation occurs, evolution is incapable of copying that mutation in a hundred organisms to filter out noise from other factors in evaluating its contribution to fitness. An amazingly beneficial mutation could die with the first organism to have it, because of the dumb luck of being targeted by a predator when only a few days old.
For more on the limitations of evolution, and some examples of how human intelligence does much better, see Evolutions Are Stupid.
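The dumb-luck point can be made quantitative with a toy branching process (the parameters here are illustrative, not taken from the article): even a mutation with a 10% fitness advantage is usually lost while it exists in only a few copies.

```python
import random

def mutation_survives(advantage, rng):
    """Toy branching process: each copy of the mutation leaves 0 or 2
    offspring copies, with mean offspring number 1 + advantage."""
    p_two = (1.0 + advantage) / 2.0  # P(2 offspring); P(0) = 1 - p_two
    copies = 1
    while 0 < copies < 1000:  # 1000 copies: treat the lineage as established
        copies = sum(2 for _ in range(copies) if rng.random() < p_two)
    return copies > 0

rng = random.Random(0)
trials = 1000
lost = sum(not mutation_survives(0.10, rng) for _ in range(trials))
loss_rate = lost / trials  # roughly 0.8: most such mutations die out by luck
```

Classical population genetics gives an establishment probability of roughly 2s for a new mutation with small advantage s, so a 10% advantage still means losing the mutation around 80% of the time, in line with what this sketch produces.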
Very interesting article that.
However, evolution is able to test and spread many genes at the same time, thus achieving higher efficiency than the article suggests. Sort of like spread spectrum radio.
I am quite certain its speed is lower than that of some statistical methods, but not by much. I would guess something like a constant factor slower, measured as the time to double a gene's concentration, compared with reaching one standard deviation of certainty about the gene's goodness by Gaussian statistics.
Random binary natural testing of a gene is less accurate than statistics, but it avoids putting counters in the cells for each gene, which shrinks the cellular machinery needed for this sort of inference and thus increases the statistical power per base pair. And I know more complicated methods are in use for some genes, such as antibodies, methylation, etc.
And then there is sexual selection, where organisms use their brains to choose partners. This is even closer to evolution assisted by a Bayesian superintelligence.
So I guess that evolution is not so slow after all.
Kim Øyhus
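The spread-spectrum analogy in the comment above can be illustrated with a toy Wright-Fisher-style sketch (all parameters are illustrative): selection raises the frequencies of many independent beneficial alleles in parallel, during the same generations.

```python
import random

def select_generation(freqs, advantage, pop_size, rng):
    """One generation of selection at many independent loci: at each locus,
    carriers of the good allele are sampled with relative weight 1 + advantage."""
    new_freqs = []
    for p in freqs:
        w = p * (1.0 + advantage)
        p_sel = w / (w + (1.0 - p))  # post-selection allele frequency
        carriers = sum(rng.random() < p_sel for _ in range(pop_size))
        new_freqs.append(carriers / pop_size)  # binomial drift in a finite population
    return new_freqs

rng = random.Random(1)
n_loci, pop = 50, 500
freqs = [0.05] * n_loci  # 50 beneficial alleles, all initially rare
for _ in range(100):
    freqs = select_generation(freqs, advantage=0.05, pop_size=pop, rng=rng)

risen = sum(f > 0.5 for f in freqs)  # loci where the good allele now dominates
```

All fifty loci are being "tested" simultaneously by the same deaths and reproductions, so the population gathers evidence about many genes at once rather than one at a time, which is the sense in which the efficiency is higher than a one-gene-at-a-time accounting suggests.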
We can see that intelligent design beats random mutation by quite a margin, by looking at the acceleration of change due to cultural evolution and technology.
Of course cultural evolution is still a kind of evolution—but intelligent mutations, multi-partner recombination and all the other differences do seem to add up to something pretty substantial.