Good question. I conclude that morality (which, as far as I can tell, seems like the same thing as goodness and altruism) does exist, and that our desire to be moral is a result of evolution (thanks for your scientific backup) just as much as our selfish desires are. Whatever category you put happiness in, goodness falls into the same one. I think some people are mystified when they make decisions that don’t optimize their happiness (like all those examples we talked about), but they shouldn’t be. Goodness is a terminal value too.
Also, morality is relative. How moral you are could be measured by some kind of altruism ratio that compares your terminal values of happiness and goodness. Someone can be “more moral” than someone else in the sense that, relative to that person, he is motivated more by goodness/altruism and less by his own personal satisfaction.
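To sketch what I mean (this is just one possible way to formalize it, not something I’d insist on): suppose a person’s motivation can be split into a weight $G$ on goodness/altruism and a weight $H$ on personal happiness. Then one candidate altruism ratio is

$$r = \frac{G}{G + H}, \qquad 0 \le r \le 1,$$

and “more moral than” just means having the larger $r$. The exact formula doesn’t matter; the point is only that the comparison is relative, not absolute.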
Is there any value in this idea? No practical value, except whatever personal-satisfaction value an individual assigns to clarity. I wouldn’t even call the idea a conclusion so much as a slightly clearer way to describe the things I already understand. I still don’t particularly like ends-in-themselves.
Reduction time:
Why should I pursue clarity, or donate to effective charities, when those choices are sub-optimal happiness-maximizers?
Because those are instrumental values.
Why should I pursue these instrumental values?
Because they lead to happiness and goodness.
Why should I pursue happiness and goodness?
Because they’re terminal values.
Why should I pursue these terminal values?
Wrong question. Terminal values, by definition, are ends-in-themselves. So here the real question is not why should I, but rather, why do I pursue them? It’s because the alien-god of evolution gave us emotions that make us want to be happy and good...
Why did the alien-god give us emotions?
The alien-god does not act rationally. There is no “why.” The origin of emotion is the result of random chance. We can explain only its propagation.
Why should we be controlled by emotions that originated through random chance?
Wrong question. It’s not a matter of whether they should control us. It’s a fact that they do.
I pretty much agree. But I have one quibble that I think is worth mentioning. Someone else could just say, “No, that’s not what morality is. True morality is...”.
Actually, let me give you a chance to respond to that before elaborating. How would you respond to someone who says this?
Reduction time:
Very very well put. Much respect and applause.
One very small comment though:
The origin of emotion is the result of random chance.
I see where you’re coming from with this. If someone else heard this out of context, they’d think, “No… emotion originates from evolutionary pressure.” But then you’d say, “Yeah, but where do the evolutionary pressures come from?” The other person would say, “Uh, ultimately the Big Bang, I guess.” And you seem to be saying, “Exactly, and that’s the result of random chance.”
Some math-y/physicist-y person might argue with you here about the Big Bang being random. I think you could provide a very valid Bayesian counterargument saying that probability is in the mind, that no one has a clue how the Big Bang/origin came to be, and so to anyone and everyone in this world, it is random.
Thanks :)
Yeah, I have no clue what evolutionary pressure means, or what the Big Bang is, or any of that science stuff yet. *sigh* I really don’t enjoy reading hard science all that much, but I enjoy ignorance even less, so I’ll probably try to educate myself more about that stuff soon after I finish the rationality book.
Ok, that’s perfectly fair. My honest opinion is that it really isn’t very practical, and if it doesn’t interest you, it probably isn’t worth it. Its value is really just for if you’re curious about the nature of reality on a fundamental level. But as far as what’s practical, I think it’s skills like breaking things down like a reductionist, open-mindedness, knowledge of which biases we’re prone to, etc.
Yeah, I guess one person has only so much time… at least for now… I am curious, but maybe not quite enough to justify the immense amount of time and effort it would take me to thoroughly understand.
I pretty much agree. But I have one quibble that I think is worth mentioning. Someone else could just say, “No, that’s not what morality is. True morality is...”.
Example case:
True morality is following God’s will? Basically everyone who says this believes “God wants what’s best for us, even when we don’t understand it.” Their understanding of God’s will and their intuitive idea of what’s best for people rarely conflict though. But here’s an extreme example of when it could: Let’s say someone strongly believes in God (or at least believes in belief), and for some reason thinks that God wants him to sacrifice his child. This action would go against his (unrecognized) terminal value of goodness, but he could still do it, subconsciously satisfying his (unrecognized) terminal value of personal happiness. He takes comfort in his belief in God and heaven. He takes comfort in his community. To not sacrifice the child would be to deny God and lose that comfort. These thoughts obviously don’t happen on a conscious level, but they could be intuitions?
Idk, feel free to throw more “true morality is...” scenarios at me...
Their understanding of God’s will and their intuitive idea of what’s best for people rarely conflict though.
What if it does conflict? Does that then change what morality is?
And to play devil’s advocate, suppose the person says, “I don’t care what you say, true morality is following God’s will no matter what the effect is on goodness or happiness.” Hint: they’re not wrong.
I hope I’m not being annoying. I could just make my point if you want.
But it seems like morality is just a word people use to describe how they think they should act! People think they should act in all sorts of ways, but it seems to me like they’re subconsciously acting to achieve happiness and/or goodness.
As for your quote… such a person would be very rare, because almost anyone who defines morality as God’s will believes that God’s will is good for humanity, even if she doesn’t understand why. This belief, and acting in accordance with it, brings her happiness in the form of security. I don’t think anyone says to herself “God has an evil will, but I will serve him anyway.” Do you?
But it seems like morality is just a word people use to describe how they think they should act!
It often is. My point is that morality is just a word, and that unfortunately it doesn’t have a well-agreed-upon meaning. And so someone could always just say, “but I define it this way.”
And so asking what morality is is really just asking how you define it. On the other hand, asking what someone’s altruism or preference ratios are is a concrete question.
You seem to be making the point that in practice, people’s definitions of morality can usually be traced back to happiness or goodness, even if they don’t know or admit it. I sense that you’re right.
Do you?
I doubt that there are many people who think that God has an evil will. But I could imagine that there are people who think that “even if I knew that God’s will was evil, following it would still be the right thing to do.”
Sure. But any definition of “right” that gives that result is more or less baked into the definition of “God’s will” (e.g., “God’s will is, by definition, right!”), and it’s not the sort of “right” I care about.
I think that’s what it often comes down to.
And so asking what morality is is really just asking how you define it.
Yay, I got your point. Morality is definitely a more ambiguous term. You’ve helped me realize I shouldn’t use it synonymously with goodness.
You seem to be making the point that in practice, people’s definitions of morality can usually be traced back to happiness or goodness, even if they don’t know or admit it.
Yes, my point exactly.
But I could imagine that there are people who think that “even if I knew that God’s will was evil, following it would still be the right thing to do.”
I am trying really hard to imagine these people, and I can’t do it. Even if God’s will includes “justice” and killing anyone who doesn’t believe, even if it’s a baby whose only defect is “original sin,” people will still say that this “just” will of God’s is moral and right.
Hmm. Well you know a ton more about this than me so I believe you.