I’m not fully convinced by the salary argument, especially with quality-of-life adjustment. As an example, let’s imagine I’m a skilled post-PhD ML engineer, deciding between:
Jane Street Senior ML Engineer: $700-750k, 50-55hrs/week, medium job security, low autonomy
[Harvard/Yale/MIT] Tenured ML Professor: $200-250k, 40-45hrs/week, ultra-high job security, high autonomy
A quick Google search says that my university grants tenure to about 20 people per year. Especially since many professors have kids, side jobs, etc., it seems unlikely that a top university really can’t find 20 good people across all fields who are both good teachers and would take the second option (in fact, I would guess that being a good teacher predisposes you to taking the second option). Is there some part of the tradeoff I’m missing?
I agree that this is the case (and indeed, a quick Google search of even my worst professors yields considerably impressive CVs). I don’t understand why that’s the case. Is it, as ErickBall suggests, simply cheaper to hire good researchers than good teachers? I find that a little unlikely. I also find it unlikely that this is more profitable—surely student tuition plus higher alumni donations would be worth more than whatever cut of NIH/NSF/etc. funding they’re taking.
My question is: who does this system leave better off? Students get worse professors, good researchers have to waste their time teaching, and good teachers have to waste their time researching. Other than maybe the science journals or something, who has a stake in perpetuating this?