Your meaning of “attitude” seems to amount to sloppy reasoning, where one endorses entertaining an unclear thought while refusing to unpack and sharpen its meaning (or alternatively to discard it as meaningless cognitive noise). Disapproving of “attitude” in this sense can then be a moral or aesthetic or instrumental judgment (as in, “it is wrong for a human being to reduce their clarity of thought”; or “it is disgusting when one engages in avoidable sloppy cognition”; or “it is disadvantageous to compromise one’s thinking skills by not exercising them in some situations”). Such judgments are not examples of “attitude”, as they can be unpacked and clarified as needed.
Your meaning of “attitude” seems to amount to sloppy reasoning, where one endorses entertaining an unclear thought while refusing to unpack and sharpen its meaning (or alternatively to discard it as meaningless cognitive noise).
Brains don’t just reason. They also make perceptual judgments, feel emotions, and feel very strongly about statements that aren’t falsifiable empirical claims.
You say “unpack and sharpen”, I hear “rationalize”. If you’re inventing the explanation after the fact of experiencing the mental activity, is the mental activity really best understood in terms of the explanation?
The point is not to invent an explanation, but to only consider explained and meaningful those things for which you understand the explanation and meaning. If no explanation is available, don’t act as if you have one, don’t trust your brain to be thinking sense when you don’t know what it’s thinking and why.
If no explanation is available, don’t act as if you have one, don’t trust your brain to be thinking sense when you don’t know what it’s thinking and why.
This doesn’t sound like a sanitary or reliable heuristic. I don’t have any explanation that I deeply understand for certain things that my brain does, but when I ignore my brain on those things I invariably end up in a horrible situation that is much worse than if I’d just listened to it. I didn’t even have any clear explanation for why my brain would go “AAAAH TIGER RUN!” until I read about ev-psych, but I’m quite confident that not trusting it in such situations would be a very bad move.
In the context of this discussion, there is enough time to think things over. I primarily object to letting your brain systematically and repeatedly engage in activities of unclear purpose and meaning, without stopping to reflect on what it’s doing and why, and without ceasing the activity if it appears to be pointless.
What I was attempting to say is that even under those circumstances, there are specific contexts in which I’m consciously unclear about what I’m doing, or why my brain wants to do it, and the activity seems pointless after a cursory analysis; but in those specific contexts, for specific types of activities, this exact pattern has repeatedly shown itself to produce reliably better results than whatever I would consciously decide to do about those things.
These are not restricted to time-constrained scenarios of pressing urgency.
However, it might not be widely applicable to just anyone, since it obviously depends on some subconscious knowledge of these particular activities and a ton of background requirements and given assumptions.
The gist is: There are specific cases where I noticed a pattern that my brain does things which are unclear to me, but where if I act on them I obtain reliably better results than if I do not, for certain contrived edge cases. For cases that do not pattern-match to known reliable results, I prefer to think things through as recommended (or sometimes experiment, if the value of information (VoI) is probably larger than the higher expected cost).
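The VoI-versus-cost rule above can be made concrete with a toy calculation. A minimal sketch, with entirely hypothetical names and numbers (nothing here comes from the discussion itself): experiment with the unclear intuition only when the expected gain from the information exceeds the extra expected cost of experimenting.

```python
# Purely illustrative decision rule: experiment with an unclear intuition
# only when the expected value of the information gained exceeds the
# extra expected cost of running the experiment. All numbers hypothetical.
def should_experiment(p_useful, payoff_if_useful, baseline_payoff, extra_cost):
    """Return True iff the VoI exceeds the extra expected cost.

    p_useful:          estimated probability the intuition beats deliberation
    payoff_if_useful:  payoff when the intuition does beat deliberation
    baseline_payoff:   payoff from just deliberating consciously
    extra_cost:        extra expected cost of experimenting
    """
    voi = p_useful * (payoff_if_useful - baseline_payoff)
    return voi > extra_cost

# e.g. a 30% chance the intuition beats deliberation by 10 units,
# at an extra expected cost of 2 units: VoI = 3 > 2, so experiment.
print(should_experiment(0.30, 20.0, 10.0, 2.0))  # → True
```

With a 10% chance instead, the VoI drops to 1 and the rule says to just think things through.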
The gist is: There are specific cases where I noticed a pattern that my brain does things which are unclear to me, but where if I act on them I obtain reliably better results
This kind of experimental evaluation seems like an all right method of judging your brain, if performed correctly. What I’m not comfortable with is endorsement of the absence of judgment over one’s cognition, or of not changing anything based on such judgment, no matter what situations that endorsement is restricted to.
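The experimental evaluation being endorsed here could be sketched as a simple outcome log: per context, tally the results of following the unclear intuition versus overriding it, and let the averages decide. A hypothetical illustration (the context names, scores, and strategy labels are invented for the example, not taken from the discussion):

```python
from collections import defaultdict

# Hypothetical logging scheme: per context, record outcome scores for
# following the unclear intuition vs. deliberating consciously.
log = defaultdict(lambda: {"intuition": [], "deliberation": []})

def record(context, strategy, outcome):
    """Log one observed outcome score for a (context, strategy) pair."""
    log[context][strategy].append(outcome)

def better_strategy(context):
    """Return whichever strategy has the higher mean outcome so far,
    or None if no outcomes have been recorded for this context."""
    scores = {
        strategy: sum(outcomes) / len(outcomes)
        for strategy, outcomes in log[context].items()
        if outcomes  # skip strategies with no data yet
    }
    return max(scores, key=scores.get) if scores else None

# Invented sample data: intuition averaged 7.5, deliberation 5.0.
record("social", "intuition", 8)
record("social", "intuition", 7)
record("social", "deliberation", 5)
print(better_strategy("social"))  # → intuition
```

The point of the sketch is only that the judgment stays explicit: the pattern earns trust per context from recorded results, rather than being exempted from evaluation.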
Hmm. Well, true for me too. I wouldn’t endorse it per se either, especially not in an ideal world with an ideal mind.
However, considering limited mental resources, limited willpower, and constant internal competition for the conscious mind’s attention, I believe this kind of behavior is instrumentally rational: it works when you have a good idea of when automatic behavior produces better results and, more importantly, of the much more likely times when it doesn’t.