Similar issues but less extreme. The degree of concentration of power, misalignment risk, etc. from intelligence amplification would be smaller than from AGI.
It would certainly be slower, being bound to human dynamics. But “the same problems, only slower” isn’t a solution or an alternative. Admittedly it’s better in the limited sense that it’s less likely to end in outright extinction, but it’s a rather grim world either way.
Since intelligence amplification is slower, the tech development cycle is comparatively faster, so diffusion can keep pace with concentration. Tech development --> cheaper tech --> broader access --> less concentration of power --> more moral outcomes.