This is part of what you mean when you say the report-drafting scientist is “not a bad person”—they’ve followed the letter of the moral law as best they can [...] your judgment (“I guess they’re not a bad person”) is the judgment that morality encourages you to give
So, from my perspective as an author (which, you know, could be wrong), that line was mostly a strategic political concession: there’s this persistent problem where when you try to talk about harms from people being complicit with systems of deception (not even to do anything about it, but just to talk about the problem), the discussion immediately gets derailed on, “What?! Are you saying I’m a bad person!? How dare you!” … which is a much less interesting topic.
The first line of defense against this kind of derailing is to be very clear about what is being claimed (which is just good intellectual practice that you should be doing anyway): “By systems of deception, I mean processes that systematically result in less accurate beliefs—the English word ‘deception’ is often used with moralizing connotations, but I’m talking about a technical concept that I can implement as literal executable Python programs. Similarly, while I don’t yet have an elegant reduction of the underlying game theory corresponding to the word ‘complicity’ …”
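The claim that “systems of deception” is a technical concept, not a moral accusation, can be illustrated with a toy sketch (the function names and the specific reporting policy here are hypothetical illustrations, not anyone’s actual programs): a reporting process is a “system of deception” in this sense just when it systematically leaves the listener with less accurate beliefs than an honest process would, regardless of anyone’s intent.

```python
import random

# Toy model: the world is in state "good" or "bad" with equal probability.
# A reporter observes the state and sends a report; a listener naively
# believes the report. A "system of deception" is any reporting policy
# that systematically lowers the listener's accuracy, with no reference
# to anyone's character or intentions.

def honest_report(state):
    # Reports exactly what was observed.
    return state

def biased_report(state):
    # Reports "good" 80% of the time regardless of the actual state
    # (e.g., incentives reward good news). No one need be "lying";
    # the process still degrades the listener's beliefs.
    return "good" if random.random() < 0.8 else state

def listener_accuracy(report_policy, trials=10_000):
    # Fraction of trials in which the listener's belief (taken
    # straight from the report) matches the true state.
    correct = 0
    for _ in range(trials):
        state = random.choice(["good", "bad"])
        belief = report_policy(state)
        correct += (belief == state)
    return correct / trials
```

Under the honest policy the listener is right every time; under the biased policy accuracy drops toward 60% (right whenever the state is good, and only 20% of the time when it is bad), which is the sense in which the biased process “systematically results in less accurate beliefs.”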
The second line of defense is to throw the potential-derailer a bone in the form of an exculpatory disclaimer: “I’m not trying to blame anyone, I’m just saying that …” Even if (all other things being equal) you would prefer to socially punish complicity with systems of deception, by precommitting to relinquish the option to punish, you can buy a better chance of actually having a real discussion about the problem. (Making the precommitment credible is tough, though.)
Ironically, this is an instance of the same problem it’s trying to combat (“distorting communication to appease authority” and “distorting communication in order to appease people who are afraid you’re trying to scapegoat them on the pretext of them distorting communication to appease authority” are both instances of “distorting communication because The Incentives”), but hopefully a less severe one, whose severity is further reduced by explaining that I’m doing it in the comments.
You can also think of the “I’m not blaming you, but seriously, this is harmful” maneuver as an interaction between levels: an axiological attempt to push for a higher moral standard in a given community, while acknowledging that the community does not yet uphold the higher standard (analogous to a moral attempt to institute tougher laws, while acknowledging that the sin in question is not a crime under current law).
noticing small lies committed by accident or under stress.
Lies committed “by accident”? What, like unconsciously? (Maybe the part of your brain that generated this sentence doesn’t disagree with Jessica about the meaning of the word lie as much as the part of your brain that argues about intensional definitions??)
(Thanks for your patience.)