I’d offer two counterpoints:
a) Even at high levels, professors are rarely teaching the absolute cutting edge. With the exception of my AI/ML courses and some upper-level CS, I don’t think I’ve learned much that a professor 10-20 years ago wouldn’t have known. And CS is very likely the outlier in this regard: I would be mildly surprised if more than 5-10% of undergrads encounter, say, chemistry, economics, or physics that wasn’t already mainstream 50 years ago.
b) As a ballpark estimate based on a couple of specific schools, maybe 10% of undergrads at a top university go on to a PhD. Universities can (and should) leverage the fact that very few of their students want to go on to do research, and that the ones who do will almost all have 4-5 more years of school in which to learn how to do good research.
If I were running a university, I would adopt somewhat standardized curricula for most courses and stipulate that professors test their students on that material. For the undergraduate program, I would aim to hire the best teachers (conditional on a very strong command of the material, obviously), while for the graduate school I would aim to hire the best researchers, who would teach fewer courses since they would never teach undergrads. Top researchers would be attracted by never having to teach intro courses; top teachers, by not being pressured to constantly put out research; undergrads, by having competent teachers; and PhD students, by the individual attention that comes with research faculty’s undivided focus. And the university’s output of top-tier research would probably increase, since those researchers no longer have to teach Bio 101 or whatever.
I contend that this leaves all the stakeholders better off without being more expensive, more difficult, or more resource-intensive. Obviously I’m wrong somewhere, or colleges would just do this, but I’m unsure where...
Update: someone IRL gave me an interesting answer. In high school, we had to take a bunch of standardized tests: AP exams, the SAT and ACT, national standardized tests, etc. My school was a public school, so its funding and status were highly dependent on these exam results. This meant that my teachers had a genuine vested interest in the students actually understanding the content.
Colleges, on the other hand, have no such obligation. Since the same institution both administers the classes and decides who gets a degree, there’s very little incentive for it to actually teach anything, especially since students will typically teach themselves the skills they need for a job anyway (e.g. all the CS kids grinding LeetCode for a FAANG internship). There’s actually so little accountability it’s laughable. And with that little oversight, why would anyone bother being a good teacher?
God, I hate bad incentive structures.