I would suggest that concern over the ‘sacred’ is just one manifestation of a misplaced overconcern with emotion and sensation which is antithetical to rationality.
This is another example of the point I made a few weeks ago, about confusing emotion with irrationality.
Emotion and sensation are the basic foundations of all thought. Without them we would not be able to think. Rational and irrational describe conscious thought, which uses, among other things, our emotions and especially our sensations as its anchors to reality.
I am still not ready to respond at greater length; I am working on a short essay to assemble my thoughts. You might try Jonathan Baron’s “Thinking and Deciding”; Chapter 3 in the second edition has more about this.
“confusing emotion with irrationality.”

Emotion and rational thought are antithetical. This is obvious both from everyday experience and from even a cursory study of the neurological evidence.
I don’t know precisely why you people keep insisting you can be rational while permitting yourself strong emotions, but I can make some pretty good guesses.
If you take a person and remove their emotion, you don’t get Spock from Star Trek; you get someone completely unable to make decisions in their life, someone who can think of no rational reason to choose one flavour of crisps over another and dithers for hours.
Inappropriate emotion can certainly cloud judgement, but removing the emotions altogether won’t, I don’t think, make you any more rational.
“If you take a person and remove their emotion,”

That isn’t what I’m talking about… and quite frankly, I doubt that such individuals represent valid examples of emotionlessness without additional damage. If you don’t care which flavor you get but you want the chips, you simply pick one at random. Dithering for hours over an irrelevant detail is neither rational nor intelligent.
Such people are more accurately described by saying that their pondering-resolution systems are defunct.
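The “pick one at random” rule above can be sketched as a toy decision procedure: take the highest-valued option, and break exact ties randomly instead of deliberating further. The flavour names and the flat utility function are invented placeholders, not anything from the discussion.

```python
import random

def pick(options, utility):
    """Choose the option with the highest utility; break exact ties
    at random rather than dithering over an irrelevant difference."""
    best = max(utility(o) for o in options)
    tied = [o for o in options if utility(o) == best]
    return random.choice(tied)

flavours = ["salted", "vinegar", "cheese"]
# If every flavour is valued equally, any choice is as good as any other,
# so a rational agent just picks one and moves on.
print(pick(flavours, lambda o: 1.0))
```

The point of the random tie-break is that when the options are genuinely indistinguishable in value, further deliberation has no payoff, so terminating the choice is the rational move.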
Rational individuals are those who are able to screen out the influence of their emotions from their reasoning. There is ample neurological data supporting this claim.
But there IS a flavour that you’d enjoy most; it’s just that without projecting yourself into that future position and imagining its emotional content, you can’t decide which will be best.
New Scientist had an article {subscription needed, mirrored here} which mentions “Elliot”, a patient described in “Descartes’ Error”:

Intellectually, Elliott is unimpaired. IQ and memory tests reveal nothing abnormal
. . .
Eventually the researchers trace this myopic indecisiveness to a curious absence of feeling, itself the result of damage to the frontal part of the brain’s cortex. Elliott “knows” but cannot “feel”. When confronted with pictures of people injured in gory accidents, he knows intellectually that he should feel distressed, but he doesn’t actually feel distressed.
. . .
Without these emotional changes to guide his thought processes, concludes Damasio, life for Elliott is a hell of indecision. Yes, he can mull over every option ad infinitum; but when it comes to experiencing the subtle internal values and biases of feeling necessary for actually choosing between the options, “gut feelings” or “instincts” are just plain missing. Elliott, in Damasio’s own words, is “irrational concerning the larger framework of behaviour”.
Too much emotion certainly can make us irrational, but so can too little. You need to know how things will make you feel in order to rationally choose between alternatives.
If I can’t decide which will be best, I’ll just choose one.
Elliot seems to have problems valuing things—not surprising, since the frontal lobes make it possible to associate abstract ideas and the valences of preference, among other things.
It seems to me that he would have made a decision based on his feelings, and now that his feelings can no longer be associated with states, the decision process no longer terminates.
Think rather of people with “flattened affect”. That’s what we should be aiming for. Think Mr. Data.
Yeah, see, figuring things out from first principles, rigorously applying your values, calculating the best option given a multi-dimensional array of preferences in various categories and doing a weighted sum on them to determine an appropriate course is a good thing. People should definitely know how to do that. I’m glad I have whatever basic grasp on the functions involved that I do have.
But it’s not actually how human beings think.
Even determining what heuristic you’d use to judge a situation’s utility on that multidimensional scale would be a monumental undertaking. It’d take an age.
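The deliberate first-principles procedure described above could be sketched as a weighted sum over preference categories. Everything here is an invented illustration: the categories, weights, and scores are placeholders, not anything proposed in the thread.

```python
# Toy sketch of explicit, deliberate decision-making: score each option
# on several preference dimensions, then combine with a weighted sum.
weights = {"taste": 0.5, "price": 0.3, "novelty": 0.2}

options = {
    "salted":  {"taste": 7, "price": 9, "novelty": 2},
    "vinegar": {"taste": 8, "price": 8, "novelty": 4},
    "cheese":  {"taste": 6, "price": 7, "novelty": 8},
}

def weighted_score(scores):
    # Combine the per-dimension scores into a single utility number.
    return sum(weights[dim] * scores[dim] for dim in weights)

best = max(options, key=lambda name: weighted_score(options[name]))
print(best)  # vinegar
```

Even in this toy form, the hard part is hidden in the numbers: choosing the dimensions, the weights, and the per-option scores is exactly the costly heuristic work the next paragraph argues the subconscious normally does for us.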
What actual human beings do is let their subconscious brain do all that tricky heuristic-based summing and weighting and determining which aspects are important and then signal it up to consciousness via a wooly, fuzzy, sometimes vague, often powerful, emotional response.
Data wanted to be human, but he can never be human because his physiology just isn’t wired that way.
You may determine that you want to be an android, dispassionately calculating every move from first principles and values. But you just ain’t wired that way. You won’t have time to run a wetware program to emulate it. Not one that doesn’t take advantage of your emotions anyway.
God knows how an emotionless person would fare trying to predict or influence another human being’s actions.
Better to learn to hear your emotions, to understand the message they’re giving you, learn to fine-tune them if they’re lying or wrong, than to ignore them.
Emotions, especially strong emotions, will tend to distract and bias your thinking. This is “obvious from everyday experience”, which is why you don’t try to do any difficult thinking while in the grip of strong emotions, at least not if you’re rational. But this does not make them “antithetical”.
Harry Browne, in “How I Found Freedom in an Unfree World”, suggested the purpose of developing a personal morality is to have rules to guide your actions when you are too emotionally engaged to think rationally. You think out the appropriate responses to various situations and problems rationally, then use these responses as rules to guide your behavior when you don’t have time or are too distracted to think rationally.
I’m rather conflicted in my response to this… ADBOC, I suppose. (Agree denotatively but object connotatively.)
On the one hand, I agree with you that emotion distorts reasoning—especially negative emotion. However, it’s the desire to suppress negative emotions that powers most “motivated reasoning”—i.e., we try to explain away our fears and setbacks.
But this means that pretending not to feel negative emotions leads to precisely the distortion you seem to be saying you’re concerned about.
In contrast, we have no reason to explain away positive emotions, nor do we generally feel the need to randomly make up explanations to feel good about—if we feel good, we generally just feel it, and are maybe motivated to DO something about it. (We don’t normally sit around reasoning about it, unless we also have some fear about being happy… in which case it’s the fear that motivates the reasoning.)
So while your statement is literally true—negative emotions motivate distorted reasoning, and positive emotions don’t necessarily encourage ANY reasoning… that doesn’t mean that suppressing or ignoring emotions is actually useful!
To engage and eliminate negatively-motivated reasoning, it’s necessary to first face the facts behind the emotion in question. You can’t be an emotional illiterate, and still be rational.
“But this means that pretending not to feel negative emotions leads to precisely the distortion you seem to be saying you’re concerned about.”
If I may make a suggestion: I highly recommend reading Diane Duane’s “Spock’s World”. There is an extensive discussion of the difference between mastering one’s emotions and merely pretending that they don’t exist.
Emotion can work against, or in conjunction with, rational cognitions, depending on the specific case at hand. For example, hypocrisy tends to anger people, and desire to avoid hypocrisy may lead people to avoid contradiction in their views (I say “may,” because of course not everyone will even scrutinize themselves for hypocrisy, only others).
Reading about something like Lysenkoism makes me mad, but that emotion might actually inspire people to be more rational, rather than less, in considering science.
Emotions are but one of many heuristics in coming up with arguments (i.e. Reichenbach’s context of discovery), which is fine as long as we have a rational justification for that argument (context of justification). If emotions are so strong that they are impinging on the context of justification, not just discovery, then I agree that there would be a problem.