required intelligence thresholds (for the individual scientists/inventors involved) are only weak evidence about relative problem difficulty (for society, which seems to me the relevant sort of “difficulty” here).
This sounds right, yeah. If I had to guess, I would guess AGI alignment is both kinds of problem (Maxwell/Faraday equations, and rockets).