One point is that I feel very unconfused. That is, not only do I not feel confused now, I once felt confused and experienced what I thought was confusion lifting and being replaced by understanding.
I feel like it’s useful that a part of me always kicks in
Which criterion for usefulness, if just one, are you using here? It is useful for humans to have pain receptors, but there is negative utility in being vulnerable to torture (and not just from one’s personal perspective).
Surely you don’t expect that even the most useful intuition is always right? This is similar to the Bin Laden point above, that the most justified and net-good action will almost certainly have negative consequences.
I’m willing to call your intuition useful if it often saves you from being misled; its score on any particular case need not weigh heavily in its overall value.
However, its score on any particular case is indicative of how it would do in similar cases. If the intuition has a short track record and it fails this test, we have excellent reason to believe it is poorly tuned, because we know little beyond how it did on this one hypothetical; even so, its poor performance here should not itself be counted as a significant part of what makes it generally out of step with moral dilemmas. This is analogous to getting cable ratings from a small panel of tracked boxes: we infer that millions watched a show because many of the few thousand tracked households did, but we do not think those few thousand constitute a substantial portion of the audience.
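The ratings analogy can be made concrete with a toy simulation. This is just an illustrative sketch (the population size, panel size, and watch rate are all made-up numbers, not real Nielsen figures): a small panel's watch rate is used to estimate a huge audience, yet the panel is a negligible fraction of that audience.

```python
import random

random.seed(0)

population = 10_000_000      # hypothetical total cable households
sample_size = 2_000          # hypothetical tracked-box panel
true_watch_rate = 0.3        # hypothetical fraction actually watching

# Simulate which panel households watched the show.
panel = [random.random() < true_watch_rate for _ in range(sample_size)]
watched_in_panel = sum(panel)

# Extrapolate: the panel's rate stands in for the whole population.
estimated_audience = round(watched_in_panel / sample_size * population)

print(f"{watched_in_panel} of {sample_size} tracked boxes watched")
print(f"estimated audience: ~{estimated_audience:,}")
# The evidence (the panel) is a tiny sliver of the conclusion (the audience):
print(f"panel share of estimated audience: {sample_size / estimated_audience:.4%}")
```

The point carries over directly: the hypothetical is most of our evidence about the intuition, just as the panel is most of our evidence about the audience, without being a substantial portion of what it is evidence about.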
Which criterion for usefulness, if just one, are you using here? It is useful for humans to have pain receptors, but there is negative utility in being vulnerable to torture (and not just from one’s personal perspective).
That’s the one I’m referencing. My fear of having been terribly immoral (which could, even less charitably, be characterized as being, or at least being motivated by, an unreasonable fear of negative social feedback) is useful because it increases the extent to which I’m reflective about my decisions and practical moral positions, especially in situations that pattern-match to ones I’ve already implicitly labeled as ‘situations where it would be easy to deceive myself into thinking I had a good justification when I didn’t’, or ‘situations where it would be easy to throw up my hands because it’s not like anyone could actually expect me to be perfect’. Vegetarianism is a concrete example. The alarm itself (though perhaps not the state of mind that summons it) has been practically useful in the past, even just from a hedonic perspective.
OK, but sometimes you will end up making the same decision after reflection, having only wasted time; other times you may even switch from a good decision (by all relevant criteria) to a bad one simply because your self-reflection was poorly executed. That doesn’t necessarily mean there’s something wrong with you for having the fear, or with the fear itself (though it seems too strong, in my opinion).
This should be obvious (it wasn’t to me until my second reading of your comment), but “increases the extent to which I’m reflective” really ought to sound extraordinarily uncompelling to us. Think about it: a bias increases the extent to which you do something. That thing is not always good to increase, and the only reason it seems otherwise is that we automatically assume there are biases in the opposite direction strong enough that we won’t overshoot however much we bias ourselves the other way. Even so, combating bias with bias is not ideal.