Should I try to stop myself loving the doll? If I ask what love is for,
Asking what purpose love evolved for...
then it seems I should.
What is doing the seeming here? Is your built-in morality, which makes no direct reference to evolutionary arguments, but which might be swayed by them, evaluating this argument and coming to a conclusion?
If the change in morality suggested by an evolutionary line of reasoning were repugnant to you, would you reject it? Then you’re not putting the cart before the horse, and good for you. Eliezer’s talking about different people.
This seems useful in my book,
Only to the extent that it gives answers you’re happy with by other, more primary criteria. And to that extent, it’s just one more kind of moral argument.