Uhm, it’s evident I’ve not made my argument very clear (which is not a surprise, since I wrote that stub in less than a minute).
Let me rephrase it in a way that addresses your point:
“Value alignment is very hard, because two people, both very intelligent and with a mostly coherent epistemology, can differ radically in their values because of a very tiny difference.
Think, for example, of neoreactionaries or of rationalists converted to religions: they have a mostly coherent set of beliefs, often diverging from atheist or progressive rationalists on very few points.”
I am saying that this is a difference in beliefs, not in values, or at least not necessarily in values.
They want to steer the future in a different direction from the one I want, so by definition they have different values (they might be instrumental values, but those are important too).
Ok, but in this sense every human being has different values by definition, and always will.