I was reminded of OpenAI’s definition of AGI, a technology that can “outperform humans at most economically valuable work”, and it triggered a pet peeve of mine: It’s tricky to define what it means for something to make up some percentage of “all work”, and OpenAI was painfully vague about what they meant.
More than 50% of people in England used to work in agriculture; now it’s far less. But if you said that farming machines “can outperform humans at most economically valuable work,” it just wouldn’t make sense: farming machines obviously don’t account for 50% of current economic value.
The most natural definition, in my opinion, is that even after the economy adjusts, AI would still be able to perform >50% of the economically valuable work: labor income would make up <<50% of GDP, and AI income would make up >50% of GDP. In a simple Acemoglu/Autor task-based model, where output is produced from a continuum of tasks, this corresponds to AI performing more than 50% of the tasks.
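To spell out that last step, here is a minimal sketch of the task-based model I have in mind (after Acemoglu and Autor). The Cobb-Douglas aggregator and the sharp automation threshold I are simplifying assumptions I’m adding for illustration, not part of OpenAI’s definition:

```latex
% A minimal sketch of a task-based model (after Acemoglu & Autor).
% The Cobb-Douglas aggregator and the sharp automation threshold I
% are simplifying assumptions made for illustration.
\documentclass{article}
\usepackage{amsmath}
\begin{document}

Output is produced from a continuum of tasks $i \in [0,1]$:
\[
  \ln Y = \int_0^1 \ln y(i)\, di .
\]
Suppose AI performs every task $i \le I$ and labor performs the rest.
With this Cobb-Douglas aggregator, each task receives an equal share of
total spending, so factor income shares equal task shares:
\[
  \frac{\text{AI income}}{Y} = I,
  \qquad
  \frac{\text{labor income}}{Y} = 1 - I .
\]
Hence ``AI income $> 50\%$ of GDP'' is the same statement as ``AI
performs more than half the tasks,'' i.e.\ $I > \tfrac{1}{2}$.

\end{document}
```

Under these assumptions, the income-share version of the definition and the task-share version coincide exactly; in richer models (non-unit elasticity of substitution across tasks) they can come apart.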
I don’t think OpenAI’s definition is wrong, exactly, since it does have a reasonably natural interpretation. But I really wish they’d been clearer.
I think I disagree. It’s more informative to answer in terms of value as it would be measured today, not value after the economy adjusts.
Suppose someone from 1800 wants to figure out how big a deal mechanized farm equipment will be for humanity. They call up 2025 and ask, “How big a portion of your economy is devoted to mechanized farm equipment, or to farming enabled by mechanized equipment?” We give them a tiny number. They also ask about top hats, and we give them another tiny number. From these tiny numbers they conclude that neither mechanized farm equipment nor top hats will be important for humanity. The post-adjustment share can’t distinguish a transformative technology from a mere fad, precisely because a transformative technology drives its own share down.
EDIT: The sort of situation I’m worried your definition would miss is one where remote-worker AGI becomes too cheap to meter, but human hands are still valuable.
In that world, I don’t think people would say “we have AGI,” right? It would be obvious to them that most of what humans do (what they do at that time, which is all they know about) is not yet doable by AI.
Your preferred definition would also leave the term AGI open to a scenario where 50% of current tasks get automated gradually, using technology not much different from today’s (i.e., normal economic growth). It wouldn’t feel like “AGI arrived”; it would feel like “people gradually built more and more software over 50 years that could do more and more stuff.”