I’m not sure that, absent that, there actually is competition between countries—what are they even competing on? You’re reasoning as though they compete on economic efficiency, but what causes countries with lower economic efficiency to vanish?
I guess ultimately they’re competing to colonize the universe, or be one of the world powers that have some say in the fate of the universe? Absent military conflict, the less efficient countries won’t disappear, but they’ll fall increasingly behind in control of resources and overall bargaining power, and their opinions just won’t be reflected much in how the universe turns out.
In that case this model would only hold if governments:

- Actually think through the long-term implications of AI
- Think about this particular argument
- Have enough certainty in this argument to actually act upon it
Notably, there aren’t any feedback loops for the thing-being-competed-on, and so natural-selection style optimization doesn’t happen. This makes me much less likely to believe in arguments of the form “The thing-being-competed-on will have a high value, because there is competition”—the mechanism that usually makes that true is natural selection or some equivalent.
I think I oversimplified my model there. Actually competing to colonize/influence the universe will be the last stage, by which point the long-term implications of AI and of this particular argument will already be clear. Before that, the dynamics would be driven more by internal political and economic processes (some countries already have authoritarian governments and would naturally gravitate towards more centralization of power through political means, while others lack strong laws/institutions to prevent centralization of the economy through market forces), by competition for power (such as diplomatic and military power) and prestige on the world stage (both of which are desired by leaders and voters alike), and by direct military conflicts.
All of these forces create pressure towards greater AGI-based centralization, while the only thing pushing against it appears to be political pressure in some countries against centralization of power. If those countries succeed in defending against centralization but fall significantly behind in economic growth as a result, they will end up not influencing the future of the universe much, so we might as well ignore them and focus on the others.