Aside from the literature on international relations, I don’t know much about academic dysfunction (what little I do know comes mostly from reading parts of Inadequate Equilibria, particularly the visitor dialog), and other LessWrong people can probably cover it better. I think that planecrash, Yud’s second HPMOR-scale work, mentions that everyone in academia generally avoids citing things published outside of academia, because they risk losing status if they do.

EDIT: I went and found that section; it is here:
It turns out that Earth economists are locked into powerful incentive structures of status and shame, which prevent them from discussing the economic work of anybody who doesn’t get their paper into a journal. The journals are locked into very powerful incentive structures that prevent them from accepting papers unless they’re written in a very weird Earth way that Thellim can’t manage to imitate, and also, Thellim hasn’t gotten tenure at a prestigious university which means they’ll probably reject the paper anyways. Thellim asks if she can just rent temporary tenure and buy somebody else’s work to write the paper, and gets approximately the same reaction as if she asked for roasted children recipes.
The system expects knowledge to be contributed to it only by people who have undergone painful trials to prove themselves worthy. If you haven’t proven yourself worthy in that way, the system doesn’t want your knowledge even for free, because, if the system acknowledged your contribution, it cannot manage not to give you status, even if you offer to sign a form relinquishing it, and it would be bad and unfair for anyone to get that status without undergoing the pains and trials that others had to pay to get it.
She went and talked about logical decision theory online before she’d realized the full depth of this problem, and now nobody else can benefit from writing it up, because it would be her idea and she would get the status for it and she’s not allowed to have that status. Furthermore, nobody else would put in the huge effort to push forward the idea if she’ll capture their pay in status. It does have to be a huge effort; the system is set up to provide resistance to ideas, and disincentivize people who quietly agreed with those ideas from advocating them, until that resistance is overcome. This ensures that pushing any major idea takes a huge effort that the idea-owner has to put in themselves, so that nobody will be rewarded with status unless they have dedicated several years to pushing an idea through a required initial ordeal before anyone with existing status is allowed to help, thereby proving themselves admirable enough and dedicated enough to have as much status as would come from contributing a major idea.
To suggest that the system should work in any different way is an obvious plot to steal status that is only deserved by virtuous people who work hard, play by the proper rules, and don’t try to cheat by doing anything with less effort than it’s supposed to take.
It’s glowfic, so of course I don’t know how accurate it is; it’s intended to be plausibly deniable enough to facilitate free writing (while keeping things entertaining enough to register as not-being-work).
I have to think more about the status dynamics that Eliezer talked about. There’s probably something to it… But this part stands out as wrong or at least needing nuance/explanation:
Thellim hasn’t gotten tenure at a prestigious university which means they’ll probably reject the paper anyways
I think most academic venues do blind review, and whoever decides whether or not to accept a paper isn’t supposed to know who wrote it? Which isn’t to say that the info won’t leak out anyway and influence the decision. (For example, I once left out the acknowledgements section in a paper submission, thinking that, like the author byline, I was supposed to add it after the paper was accepted; apparently I was actually supposed to include it, and someone got really peeved that I didn’t.)

Also, it seems weird that Eliezer wrote this in 2021, after this happened in 2019:
MIRI suggested I point out that Cheating Death In Damascus had recently been accepted in The Journal of Philosophy, a top philosophy journal, as evidence of (hopefully!) mainstream philosophical engagement.
From talking with people who serve on a lot of grant committees at the NIH and similar funding orgs: it’s really hard to do proper blinding of reviews. Certain labs tend to focus on particular theories and methods, repeating variations of the same idea… So if you are familiar with the general approach of a particular lab and its principal investigator, you will immediately recognize, and have a knee-jerk reaction (positive or negative) to, a paper that pattern-matches to the work that that lab / subfield is doing.
Common reactions from grant reviewers:
Positive—“This fits in nicely with my friend Bob’s work. I respect his work, I should argue for funding this grant.”
Neutral—“This seems entirely novel to me, I don’t recognize it as connecting with any of the leading trendy ideas in the field or any of my personal favorite subtopics. Therefore, this seems high risk and I shouldn’t argue too hard for it.”
Slightly negative—“This seems novel to me, and doesn’t sound particularly ‘jargon-y’ or technically sophisticated. Even if the results would be beneficial to humanity, the methods seem boring and uncreative. I will argue slightly against funding this.”
Negative—“This seems to pattern-match to a subfield I feel biased against. Even if this isn’t from one of Jill’s students, it fits with Jill’s take on this subtopic. I don’t want views like Jill’s gaining more traction. I will argue against this regardless of the quality of the logic and preliminary data presented in this grant proposal.”
Ah, sorry that this wasn’t very helpful. I will self-downvote so this isn’t the top comment. Yud’s stuff is neat, but I haven’t read much on the topic, and passing some along when it comes up has been a good general heuristic.
No need to be sorry, it’s actually great food for thought and I’m glad you pointed me to it.