In the New York example, it could be that when someone says, “Guys, we should really buy those Broadway tickets. The trip to New York is next month already,” they prompt the response “What? I thought we were going the month after!” Hence the disagreement. If this detail had been discussed earlier, the group might have spoken of the “February trip” and the “March trip” in order to disambiguate the trip(s) to New York.
I guess I don’t understand what focusing on disagreements adds. Sure, in this situation, the disagreement stems from some people thinking the trip is near (and others thinking it’s farther away). But we already knew that some people think AGI is near and others think it’s farther away! What does observing that people disagree about that stuff add?
What seems to have happened is that people at one point latched on to the concept of AGI, assuming that their interpretation was virtually the same as everyone else’s precisely because the term was never pinned down. Again, if they had disagreed with the definition to begin with, they would have used a different word altogether. Now that some people are claiming that AGI is here, or coming soon, it turns out that the interpretations do in fact differ.
Yeah, I would say that as those early benchmarks (“can beat anyone at chess”, etc.) are achieved without producing what “feels like” AGI, people are forced to make their intuitions concrete, or anyway reckon with their old bad operationalizations of AGI. And that naturally leads to lots of discussion around what actually constitutes AGI. But again, all this is evidence of is that those early benchmarks have been achieved without producing what “feels like” AGI. And we already knew that.
But we already knew that some people think AGI is near and others think it’s farther away!
And what do you conclude based on that?
I would say that as those early benchmarks (“can beat anyone at chess”, etc.) are achieved without producing what “feels like” AGI, people are forced to make their intuitions concrete, or anyway reckon with their old bad operationalizations of AGI.
The relation between the real world and our intuition is an interesting topic. When people’s intuitions are violated (e.g., the Turing test is passed but it doesn’t “feel like” AGI), there’s a temptation to try to make the real world fit the intuition, when it is more productive to accept that the intuition is wrong. That is, maybe achieving AGI doesn’t feel like you expect. But that can be a fine line to walk. In any case, privileging an intuitive map above the actual territory is about as close as you can get to a “cardinal sin” for someone who claims to be rational. (To be clear, I’m not saying you are doing that.)