All good and valid points.
However, my estimate that we might be 10% done is overwhelmingly based on progress in the last ~5 years, which was already accelerated by many of the effects you describe. So a person-year now is almost certainly not equivalent, in objective terms, to a person-year in the 19th or 20th century, but that factor is already allowed for in my rough guess of task scope (which has pretty large error bars, even compared to the scale of this effect). I was also explicitly assuming that AI assistance with alignment would likely increase, but that, in the sensitive area of AI Alignment where caution is required, it might be somewhat more limited than in most other areas of coding and research. Currently, that assistance is mostly coding and mathematical help, and there it is accelerating rapidly. But we have also seen demonstrations of sandbagging and sabotage from misaligned models on exactly this kind of AI Alignment coding task, so I hope people are, like me, monitoring AI coding help particularly carefully in this area.