Well actually, GAs aren’t that good at algorithms, because slightly mutating an algorithm usually either breaks it or creates an entirely different algorithm. So the fitness landscape isn’t gentle.
You can do a bit better if you work with circuits instead, and better still if you make the circuits continuous, so that small mutations create small changes in output. And once the circuits are continuous, you can optimize them much faster with gradient descent instead of GAs.
And then you have neural networks, which are quite successful.
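A minimal sketch of the point about continuity, using a single tanh neuron as the "continuous circuit" (the neuron, the toy target function, and the learning rate here are all illustrative choices, not anything from the original): a small perturbation of a weight produces a small change in output, and that same smoothness is what lets plain gradient descent fit the parameters.

```python
import math

# A tiny "continuous circuit": one neuron, y = tanh(w*x + b).
def neuron(w, b, x):
    return math.tanh(w * x + b)

# Smoothness: a small "mutation" of a weight barely changes the output,
# unlike mutating a line of a discrete algorithm, which can change anything.
print(abs(neuron(1.01, 0.0, 0.5) - neuron(1.0, 0.0, 0.5)))  # ~0.004

# Gradient descent on the same circuit: fit samples of tanh(2x - 1),
# so the recoverable true parameters are w = 2, b = -1.
data = [(x / 10.0, math.tanh(2 * x / 10.0 - 1.0)) for x in range(-10, 11)]
w, b, lr = 0.0, 0.0, 0.2
for _ in range(5000):
    gw = gb = 0.0
    for x, t in data:
        y = neuron(w, b, x)
        d = (y - t) * (1.0 - y * y)   # chain rule through tanh
        gw += d * x
        gb += d
    w -= lr * gw / len(data)         # step against the averaged gradient
    b -= lr * gb / len(data)

loss = sum((neuron(w, b, x) - t) ** 2 for x, t in data) / len(data)
print(w, b, loss)  # w and b approach 2 and -1; loss approaches 0
```

A GA could optimize the same two parameters, but gradient descent exploits the smoothness directly: each step moves downhill along the loss surface instead of sampling random mutations and keeping the lucky ones.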