I used to have a negative visceral reaction to the idea of authority, and then decided after much thought that it wasn’t so bad and in fact kind of nice.
Hm… so if you change your mind about a value, does it no longer qualify as a fundamental value? I’m not sure if we are using the word “value” in the same way.
I think it was you who posted a few months ago about moral uncertainty, and I think you also posted that humans are poorly described by utility functions.
If you believe that, you should agree that we don’t necessarily even have an actual set of moral axioms underlying all the uncertainty and signaling. The term “fundamental value” implies a moral axiom in a utility function—and while it is a useful term in most contexts, I think it should be deconstructed for this conversation.
For most people, under the right conditions, murder and torture can seem like a good idea. Smart people might act more as if they were following a set of axioms, but that’s only because they work hard at being consistent, since inconsistency causes them negative feelings.
So when I say “different values” this is what I mean:
1) John’s anterior cingulate cortex doesn’t light up brightly in response to conflict. He thus does not feel any dissonance when believing two contradictory statements, and is not motivated to re-evaluate his model. Thus, he does not value consistency like me—we have different values. Understanding this, I don’t try to convince him of things by appealing to logical consistency, instead appealing directly to other instincts.
2) Sally’s amygdala activates in response to incest, thanks to the Westermarck instinct. She thus has more motivation to condemn incest between two consenting parties, even when there is no risk of children being involved.
Mine lights up in disgust too, but to a much lesser extent. I’d probably be against incest too, but I’ve set up a hopefully consistent memetic complex of values to prevent my ACC from bothering the rest of my brain, and being against incest would destroy the consistency.
Our values are thus different—Sally’s disgust became moral condemnation, my disgust is just a squick. If Sally could give me a reason to be against incest which didn’t create inconsistency for me, she might well change my view. If she’s also one of those that values consistency, I can change her view by pointing out the inconsistency. Or, I can desensitize her instinctive disgust through conditioning by showing her pictures and video of happy, healthy incestuous couples in love talking about their lives and struggles.
3) Bob has the connections from his ventromedial prefrontal cortex to his amygdala severed. He thus is not bothered by other people’s pain. I watch Bob carefully: because he does not factor other people’s pain into his calculations about his next action, I’m afraid he might hurt people I care about, which would bother me a lot. We have different values—but I can still influence Bob by appealing to his honor. He might still be motivated to genuinely respect authority, or to follow purity rules. If he’s like Sally, he might condemn incest “because it is gross”, but the feelings of inbred children might not weigh on his mind at all.
Basically, I see “values” as partly a set of ideas and partly an extension of “personality”. You can change someone’s values through argument, conditioning, etc., but between people there are often differences in the underlying motives which drive value creation, along with the layers of memetics.
(The brain regions here are roughly in line with current understanding of what they do, but take it with a grain of salt—the underlying point is more important.)