There is no way to pass on traits which is not materially identical, with regards to evolution, to passing on whatever substrate those traits happen to currently reside on—indeed, producing many copies of one’s molecular genes without producing new individuals which are carriers of one’s traits is a failure by the standards of natural selection.
Given that you immediately give an example where they’re not identical, maybe you wanted to say something a little more complicated than “these things are materially identical.”
Anyhow, good post just on the strength of the point about Mendelian genes vs. DNA. An organism that sprays its DNA everywhere is not the sort of thing natural selection optimizes for (except in very special cases where the environment helps the DNA cause more of the organism). That seems obvious, but the implications of traits not being molecular are non-obvious.
Totally don’t buy “But maybe we needed to not be optimizing in order to have the industrial revolution”—how on earth are we supposed to define such a thing, let alone measure it? Meanwhile, our current degree of baby production is highly measurable, and we can clearly see that we’re doing way better than chance but way worse than the optimum. Whether this counts as “aligned” or “misaligned” seems to be a matter of interpretation. You can ask how I would feel about an AI that had a similar relationship to its training signal, and I’d probably call it ‘inner misaligned’, but the analogy is weak on exactly this point.
Good point WRT that first line—I edited it to something clunkier but, I think, more accurate. Hopefully the intended meaning came across anyway.
WRT the second point—I agree that this is the weakest/most speculative argument in the post, although I still think it’s worth considering. Evolution obviously “had the ability” to make us much more baby-obsessed, or to give us a higher sex drive, and yet it did not. This indicates that there are tradeoffs being made: a human with a higher reproductive drive is less fit in other ways. One of those ways is plausibly that a human with a lower reproductive drive gets more “other stuff” done—like maintaining a community, thinking about its environment, and so on—and that “other stuff” is very important for increasing the number of offspring which survive. And, indeed, we have a very important example of some “other stuff” which massively increased the total number of humans alive: the industrial revolution. It doesn’t seem absurd to suggest that it was no “mistake” for us to have the reproductive drive that we do, and that if God had reached down into the world in the distant past and made the straightforward change of “increase the reproductive drive of humans”, this would in fact have resulted in fewer humans in the year 2026.
Now, this is all rather tangential to the actual analogy being made; it’s unclear what, if anything, this has to do with AI, in large part due to the many other disanalogies between evolution and AI training. But insofar as all we are doing is judging the capacity of the human species to “fulfill the goal of evolution”, it’s relevant that our drives are what they are in large part because having them that way does “fulfill the goal”, even, in part, precisely because the drive does not perfectly match the goal.