No system of interesting complexity, including humans, actually attains long-term goals so much as it simply tries to null out the difference between its (evolving) internal model and its perceptions of its present reality.
That’s no reason to stop talking about goals and speak only of something like “utility”. Humans are psychologically goal-oriented: if you talk about goals, people understand what you mean.
Talk about “goals” can be formally translated into talk about “utility”, by considering utility to be estimated proximity to your goals. Whether goals are attained or not is a side issue. You can still discuss and model conquering the universe without actually doing it. So: no need to taboo “goals”.
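The translation from goal-talk to utility-talk can be made concrete. A minimal sketch, with hypothetical names and a toy feature space chosen for illustration: utility is defined as estimated proximity to the goal, so "pursuing a goal" and "increasing utility" describe the same behavior.

```python
import math

# Toy illustration (assumed setup, not from the original text): states and
# goals are points in a simple feature space, and distance stands in for the
# agent's estimate of how far its internal model is from the goal.

def utility(state, goal):
    """Utility as negative Euclidean distance: closer to the goal = higher utility."""
    return -math.dist(state, goal)

# An agent "pursuing a goal" is then an agent acting to raise utility,
# i.e. to null out the difference between its state and its goal.
state, goal = (0.0, 0.0), (3.0, 4.0)
print(utility(state, goal))   # -5.0: some distance remains
print(utility(goal, goal))    #  0.0: maximal utility, goal "attained"
```

Note that nothing here requires the goal ever to be reached; the utility function is well-defined, and can be reasoned about, whether or not utility ever hits its maximum.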