(That drawing of the Dunning-Kruger Effect is a popular misconception—there was a post last week on that, see also here.)
I think there’s “if you have a hammer, everything looks like a nail” stuff going on. Economists spend a lot of time thinking about labor automation, so they often treat AGI as if it will be just another form of labor automation. LLM & CS people spend a lot of time thinking about the LLMs of 2025, so they often treat AGI as if it will be just like the LLMs of 2025. Military people spend a lot of time thinking about weapons, so they often treat AGI as if it will be just another weapon. Etc.
So yeah, this post happens to be targeted at economists, but that’s not because economists are uniquely blameworthy, or anything like that.