Evolution creates complexity, but not remotely close to maximum complexity. Imagine if each individual plant/animal had a radically different design, which would be possible if they weren’t constrained by “survival of the fittest”.
This is true; but I favor systems that can evolve, because they are evolutionarily stable. Systems that can’t are likely to be unstable and vulnerable to collapse, and typically have the ethically undesirable property of punishing “virtuous behavior” within that system.
Huh? The purpose of FAI is to achieve the global maximum of whatever utility function we give it.
True. I spoke imprecisely. Life is increasing in complexity, in a meaningful way that is not the same as the negative of entropy, and which I feel comfortable calling “progress” despite Stephen Jay Gould’s strident imposition of his sociological agenda onto biology. This is the thing I’m talking about maximizing. Whatever utility function an FAI is given, it will only involve concepts we already have, which represent a small fraction of possible concepts; so it will not keep increasing in complexity in that way.