This seems like a tortured definition. There are many humans who haven’t thought seriously about their goals in life. Some of these humans are reasonably bright people. It would be a very counterintuitive definition of “intelligent” that said that a machine that thought and communicated as well as a very-well-read and high-IQ 13-year-old wasn’t intelligent.
There are many humans who haven’t thought seriously about their goals in life.
They’re not intelligent on a large scale. That is, knowing how they act in the short term is a more effective way to predict how they end up than knowing what their goals are. They may still have short-term intelligence, in that knowing their immediate actions is less useful for predicting what happens than knowing what they want in the short term. For example, against someone who’s much better than you at chess, you’d be much better off predicting the final state of the game from how they want it to end than from their individual moves.
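The chess point can be made concrete with a toy simulation (a hypothetical sketch of my own, not anything from the original): a simple goal-directed agent whose goal is known ends up in a predictable final state from many different starting points, even though its individual moves are noisy and hard to predict.

```python
import random

def optimize(start, goal, steps=100):
    """Greedy optimizer: most steps move one unit toward the goal;
    occasional random steps stand in for hard-to-predict moves."""
    state = start
    for _ in range(steps):
        if random.random() < 0.2:   # noisy, unpredictable move
            state += random.choice([-1, 1])
        elif goal > state:          # goal-directed move
            state += 1
        elif goal < state:
            state -= 1
    return state

random.seed(0)
goal = 50
finals = [optimize(random.randint(0, 100), goal) for _ in range(20)]
# Knowing the goal predicts the end state well, despite noisy moves:
print(max(abs(f - goal) for f in finals))
```

Knowing the goal (50) tells you roughly where every run ends; knowing any single early move tells you almost nothing. That asymmetry is the “predict the outcome from the goal, not the moves” claim in miniature.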
It’s not impossible to accomplish your goals without knowing what they are, but it helps. If you’re particularly intelligent with respect to some goal, you’ll probably achieve it. Nobody knows their goals exactly, but people generally know them well enough for it to help.
It would be a very counterintuitive definition of “intelligent” that said that a machine that thought and communicated as well as a very-well-read and high-IQ 13-year-old wasn’t intelligent.
Very few possible conversations sound remotely like a very-well-read, high-IQ 13-year-old. Anything that can produce one is either a very good optimization process, or just very complicated (such as a recording of such a conversation).
Edit:
I suppose I should probably justify why this definition of intelligence is useful here. When we try to build an AI, what we want is something that will accomplish our goals. It will make sure that, of all possible universes, the one that happens is particularly high on our utility ranking. In short, it must be intelligent with respect to our goals by the definition of intelligence I just gave. If it can hold a conversation like a 13-year-old, but can’t do anything useful, we don’t want it.
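The “particularly high on our utility ranking” idea can be sketched as a one-line decision rule (a generic expected-utility sketch with made-up outcome and utility tables, purely illustrative, not from the original):

```python
# Hypothetical outcome distributions and utilities, purely illustrative.
outcomes = {
    "act_a": {"good": 0.7, "bad": 0.3},
    "act_b": {"good": 0.2, "bad": 0.8},
}
utility = {"good": 10.0, "bad": -5.0}

def expected_utility(action):
    """Average utility of the universes this action tends to produce."""
    return sum(p * utility[o] for o, p in outcomes[action].items())

# Pick the action whose resulting universe ranks highest on our utility.
best = max(outcomes, key=expected_utility)
print(best)  # → act_a
```

An agent that reliably picks `best` in this sense is intelligent with respect to our goals in the way described above, whether or not it can also chat like a 13-year-old.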
If the AI is an optimization process, it will explicitly try to find out what it’s optimizing. If not, it’s not intelligent.