I agree that late in the singularity, AI workflows may be so different from humans’ that we learn very little from extrapolating from human returns to software R&D, but I expect that early in the takeoff, AIs may look significantly more like large-scale human labor (especially if they are still largely managed by humans). If existing returns are insufficient for a takeoff, that should update us against a software-only takeoff in general because it makes initial compounding less likely.
I also expect us to observe relevant returns in the near future as AIs increasingly automate AI R&D (many of the points in the above post would cover this). Early automation may give us some evidence about mid-takeoff dynamics.