expect such a crisis to have at most modest effects on timelines to existentially dangerous ASI being developed
It may be my lack of economics education speaking, but how can that be the case? Don't current timelines rely heavily on the labs' ability to raise huge capital, both for building huge datacenters and for paying many people who are smarter than current frontier models to manually generate large amounts of quality data? Wouldn't such a crisis make that much harder for them, plausibly beyond what makes direct economic sense, because of what responsible investors believe a responsible investor is expected to do?
I'm pretty sure that when he talks about knowledge damaging intelligence, he doesn't mean that the intelligence shouldn't have this knowledge at generation time. Rather, the issue in training (which you may very well call a skill issue) is that, by default, local ad hoc explanations create superficial predictive success and get in the way of more general explanations. So the issue isn't having less knowledge, but rather having less early memorization.