The discussion around “It’s Not the Incentives, It’s You” was pretty gnarly. I think at the time I was making some concrete, simple mistakes. I also think there were 4-6 major cruxes of disagreement between me and some other LessWrongers. The 2019 Review seemed like a good time to take stock of that.
I’ve spent around 12 hours talking with a couple of people who thought I was mistaken and/or harmful last time, and then 5-10 hours writing this up. I don’t feel anywhere near done, but I’m reaching the end of the timebox, so here goes.
Core Claims
I think this post and the surrounding commentary (at least on the “pro” side) was making approximately these claims:
A. You are obligated to buck incentives. You might be tempted sometimes to blame The Incentives rather than take personal responsibility for various failures of virtue (epistemic or otherwise). You should take responsibility.
B. Academia has gotten more dishonest, and academics are (wrongly) blaming “The Incentives” instead of taking responsibility.
C. Epistemics are the most important thing. Epistemic Integrity is the most important virtue. Improving societal epistemics is the top cause area.
Possible stronger claim: Lying or manipulating data on the public record is (sometimes? often?) worse than state-sanctioned killing.
D. Special Academic Responsibility: Scientists/Academics have a special responsibility to never be dishonest about their work, and/or to be proactively extremely honest, due to the nature of their profession, independent of the current standards of their profession. If they can’t, they should leave.
E. The previous points are not just correct, but obviously correct.
This is just a short summary, and the claims are necessarily simplified. I think if you add hedging words like “often” and “mostly”, some of the claims become less controversial.
But I disagreed a lot with some people on how much weight to give them. It seemed to me they were leaning much more toward absolutism, in a domain that seemed to me legitimately murky and confusing.
–
Quick tl;dr on my takes on those claims:
Re: Bucking Incentives: You should buck incentives at least some of the time, and you should allocate a lot of attention to “notice when incentives are pressuring you, and think about what you really value.”
I’m not sure about “exactly how much” and “when?”.
Re: Academia is more dishonest: I’m mostly agnostic on this. Important if true. Rohin argues with this claim, and I think people who want to debate it should respond to him.
Re: Epistemics are Most Important Thing: The previous discussion actively changed my mind on this. Previously I believed “epistemics are maybe in the top 5 causes”. I now think that Epistemics are plausibly the very top cause and most important virtue. At the very least, they are much more important than they seemed to me at the time. (Note that I still maintain that there are degrees of freedom in “how exactly do you accomplish good epistemics”, which truthseekers can disagree on)
I am still somewhat confused about “lying on the public record vs state-sanctioned killing”. I now think it is plausible. But it is a pretty intense claim (which I think most of society disagrees with). I am still mulling it over; I’m unclear on some details, and not even sure what the people I’ve argued with believe about it.
RE: Special Academic Responsibility: IMO this depends a lot on the current state of academia, and what its onboarding culture is like.
RE: “This is all obvious”: This is importantly false. It’s what I was mostly arguing against last year, and what I still argue against now.
–
I have thoughts on individual pieces of this, which I’ll write as followup comments as I find time.
Personal Anecdote:
“It wasn’t the Incentives. It was me.”
The forceful, moralizing tone of the article helped me internalize that I need the skill of noticing, and then bucking, incentives.
Just a few days ago, on Dec 31st, I found myself trying to rush an important blogpost out before 2020 ended, so it could show up next year in the 2020 LW Review. I found myself writing to some people, tongue-in-cheekly saying “Hey guys, um, the incentives say I should try to publish this today. Can you give feedback on it, and/or tell me that it’s not ready and I should take more time?”
And… well, sure I can hide behind the tongue-in-cheekness. And, “Can you help review a blogpost?” is a totally reasonable thing to ask my friends to do.
But, also, as I clicked ‘send’ on the email, I felt a little squirming in my heart. Because I knew damn well the post wasn’t ready. I was just having trouble admitting it to myself because I’d be sad if it were delayed a year from getting into the next set of LW Books. And this was a domain where I literally invented the incentives I was responding to.
It was definitely not the Incentives, It Was Me.
I still totally should have asked my friends for help here. But I already knew the answer to my primary question: “Is it shippable today?” So I didn’t need to impose any urgency on their help.
This sticks out in my mind not because the local instance was very important, but because the moral muscle of noticing a principle in the moment, and applying it, is pretty important. Someday there will be a higher-stakes situation where this matters more, and I was disappointed in myself for not getting the answer right in the low-stakes case.
I was able to notice at all, just a little too late, in large part due to this post. I hope to do better next time.
Framing disagreements
Cognitive processes vs right answers; Median vs top thinkers
My frame here is “what cognitive strategy is useful for the median person to find the right answers”.
I think that people I’ve argued against here were focused more directly on “What are the right answers?” or “What should truthseekers with high standards and philosophical sophistication do?”.
I expect there to be a significant difference between the median academic and the sort of person participating in this conversation.
I think the median academic is running on social cognition, which is very weak. Fixing that should be their top priority. I think fixing that is cognitively very different from “not being academically dishonest.” (Though this may depend somewhat on what sort of academic dishonesty we’re talking about, and how prevalent it is)
I think the people I’ve argued with probably disagree about that, and maybe think that ‘be aligned with the truth’ is a central cognitive strategy that is entangled across the board. This seems false to me for most people, although I can imagine changing my mind.
Arranging coordinated-efforts-that-work (i.e. Stag Hunts) is the most important thing, most other things are distracting and mostly not-the-point
Another central disagreement seemed to have something to do with “there are deontological or virtue-ethics norms you should be following here, about not lying, etc”.
I think it is important to follow your society’s existing norms. But when it comes to trying to improve society’s status quo, virtues and rules are much less important than the virtue of “figure out how to actually coordinate on changing things, and then do that.”
Related to the “cognitive process” point, I think people who get focused on following the exact virtues/rules mostly waste a lot of time on unimportant virtues/rules. The exceptions are when those virtues/rules happen to be particularly important, or bootstrap into stag hunts. But this requires moral luck.
My family cares a lot about recycling and buying local. A lot of the arguments I had heard about this post seemed more like the sort of cognitive algorithm that outputs ‘recycle and buy local’ than ‘Be Richard Feynman or Eliezer Yudkowsky’, when implemented on the average person.
To the extent that your summary of the “pro” case is accurate, particularly “Epistemics are the most important thing”, I find it deeply ironic and sad that all of the commentary, besides one comment from Carl Shulman (and my own), seems to be about what people should do, rather than what is actually true. One would hope that people pushing “epistemics are the most important thing” would want to rely on true facts when pushing their argument.
There are a few more threads I ideally want to write here about what I think was going on. I’m not 100% sure whether I endorse your implied argument, but I think there was something to unravel in the space you’re pointing at.