Yudkowsky says that public morality should be derived from personal morality, and that personal morality is paramount. But I don’t think this is the right way to put it. In my view, morality is the domain of social relationships that game theory describes: how to avoid playing negative-sum games, and how to achieve the maximum total payoff for all participants.
And morality is independent of values. Or rather, each value system has its own morality; or, more accurately still, morality can work even between agents with different value systems. Morality is primarily about questions of justice, though all sorts of extraneous things, like god-worship, sometimes get dragged under this heading of human sentiment, so morality and justice may not be exactly equivalent.
And game theory answers questions about how to achieve justice. Also, justice may matter to you directly, as one of your terminal values, in which case you won’t defect even in a one-shot prisoner’s dilemma with no penalty for defection. Or it may not matter to you, in which case you will defect whenever you don’t expect to be punished for it.
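The claim above can be sketched concretely. Below is a minimal illustration using an assumed, conventional one-shot prisoner’s dilemma payoff matrix (the specific numbers are mine, not from the text): for an agent who counts only its own payoff, defection dominates, yet mutual cooperation is what maximizes the sum across all participants.

```python
# Conventional one-shot prisoner's dilemma (illustrative payoffs, not canonical):
# payoffs[(my_move, other_move)] = (my_payoff, other_payoff)
C, D = "cooperate", "defect"
payoffs = {
    (C, C): (3, 3),
    (C, D): (0, 5),
    (D, C): (5, 0),
    (D, D): (1, 1),
}

# For an agent who values only its own payoff, defection strictly dominates:
for other in (C, D):
    assert payoffs[(D, other)][0] > payoffs[(C, other)][0]

# Yet mutual cooperation maximizes the total sum over all participants:
totals = {moves: sum(p) for moves, p in payoffs.items()}
best = max(totals, key=totals.get)
print(best, totals[best])  # ('cooperate', 'cooperate') 6
```

This is the gap the post is pointing at: whether you close it depends on whether justice is one of your values, or only an external constraint enforced by punishment.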
In other words, morality is universal across value systems, but it cannot be independent of them. It makes no sense to forbid hurting someone who has nothing at all against being hurt.
What I mean is that adherence to morality feels different from the inside than conformity to your values: the former feels like an obligation and the latter like a desire. In one case you say “should,” and in the other, “want.”
I’ve read “Sorting Pebbles Into Correct Heaps” several times and never understood what it was about until it was explained to me. Certainly the sorters aren’t arguing about morality, but that’s because they’re not arguing about game theory; they’re arguing about fun theory… Or, more accurately, not quite: they are pure consequentialists, after all, who care not about fun or even their own lives, but only about heaps in external reality. So it is a theory of value, but not a theory of fun: a theory of prime.
But in any case, I think people might well argue with them about morality. If people can sell primes to sorters, and sorters can sell hedons to people, would it be moral to defect in a prisoner’s dilemma, gaining 2 primes at a cost of −3 hedons? Most likely both sides would conclude that no, that would be wrong, even if it is just in the “prime” sense.
That you shouldn’t kill people, even if it gets you the primeons you so desire; and they shouldn’t destroy the correct heaps, even if they take pleasure in watching the pebbles scatter.
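The cross-species trade above can be checked with trivial arithmetic, assuming, purely for this sketch, that primes and hedons can be summed at par into one common ledger (an assumption the post does not make explicit):

```python
# Hypothetical trade from the paragraph above: the defector gains 2 primes,
# the other party loses 3 hedons. Summing at par (an illustrative assumption),
# the game is negative-sum, so morality-as-game-theory forbids the defection.
gain_primes = 2
loss_hedons = -3
total = gain_primes + loss_hedons
print(total)  # -1
assert total < 0  # negative-sum: forbidden under the framing in this post
```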
In a new mind, there is convergently useful knowledge, and there are preference parameters that could be anything. You don’t need to align the former. There are no compelling arguments about the latter.