All the things you mentioned seemed pretty goal-directed to me. Evolution has only been relatively short on goals because it has been so primitive up until now. It is easy to see systematic ways in which agents we build will not be like evolution.
It is true that not all aspects of these things are goal-directed. Some aspects of behaviour are meaningless and random, random mutation being one example.
My point is that evolution IS a superintelligence and we should use it as a model for what other superintelligences might look like.
Reality doesn’t care how you abuse terminology. An AGI still isn’t going to act like evolution.