If we’re counting guilt as suffering in an ethically consequential sense—which seems reasonable, since it’s profoundly unpleasant and bears a clear functional analogy to physical pain—and if that suffering is additive with other kinds, then consequentialists should want people to feel guilt when they do bad things if and only if that guilt eliminates more suffering (of any type) down the road. I don’t know whether you’re a consequentialist, but this seems like a good starting point.
In any case, that condition seems to hold sometimes but not always. Guilt over immutable or nearly immutable urges seems like a net loss unless those urges are both proportionally destructive and susceptible to conditioned reduction in the average case. Guilt strong enough to be unpleasant but too weak to overcome whatever other factors are making people do bad shit is likewise a loss. Interestingly, this suggests that consequentialists should sometimes prefer intense over moderate guilt—though not guilt that’s gratuitously intense relative to what’s needed to stop the behavior, since sufficiently disproportionate guilt is also a loss.
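The intensity trade-off above can be sketched as a toy model. Everything here is an invented assumption for illustration—the deterrence threshold, the step-function shape, all the numbers—the point is only the qualitative structure: guilt is net-positive exactly when the suffering it prevents exceeds the suffering it inflicts.

```python
def deterrence(intensity, threshold=5.0):
    """Suffering prevented by guilt. Assume (hypothetically) that guilt
    only changes behavior once it overwhelms the other motives pushing
    toward the harmful act, and prevents a fixed amount of harm when it
    does."""
    return 10.0 if intensity >= threshold else 0.0

def net_suffering(intensity):
    """Total suffering attributable to the guilt: the guilt itself,
    minus the harm it deters."""
    return intensity - deterrence(intensity)

for g in (0, 3, 5, 8, 20):
    print(g, net_suffering(g))
```

Under these made-up numbers, moderate guilt (intensity 3) is strictly worse than none (net 3 vs. 0), guilt just past the threshold is net-positive (5 → −5), and gratuitously intense guilt flips negative again (20 → 10)—which is the non-monotonic preference described above.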
The obvious objection to this line of thinking is that certain categories of socially constructed bad shit—not to name names—might stick around only so long as they stay at or above a certain level of prevalence in the population: a memetic equivalent of herd immunity. Since these patterns can persist indefinitely and cause suffering as long as they do, anything capable of incrementally degrading them could have second-order consequences much larger than its first-order effects, potentially enough to justify any and all related guilt. Here uncertainties about the problem structure seem to dominate the consequentialist calculation, much as in Pascal’s Mugging.
Guilt over immutable or nearly immutable urges seems like a net loss unless those urges are both proportionally destructive and susceptible to conditioned reduction in the average case.
In my experience, feelings of guilt coupled with the attitude that the urge is “immutable” can be an effective excuse not to fix harmful behavior. It’s a sort of ugh field. When the consequences of the behavior become sufficiently intolerable, one is eventually tempted to hang the guilt and test that supposed immutability.
Sure, that’s a failure mode, and it’s one which—stepping down a level of abstraction—seems prevalent in gender discussions (“I’m $gender, I can’t help it!”). From the inside, it can be pretty hard to distinguish between the motivations you can and can’t change with enough reflection. There’s a loose cultural consensus as to which count as changeable, but at the same time that varies between subcultures and can lead to conflict in its own right: consider the “ex-gay” phenomenon in fundamentalist Christian spheres.
Maybe I shouldn’t have mentioned it in context; in my estimation it’s not directly relevant to what we’re discussing upthread. But at the same time I think it’s a mistake to consider our wants entirely plastic; for the time being we’re working with a certain set of hardware, and software changes can only do so much.