Cute, but considering how contorted your position has turned out to be, you can forgive me for wondering if you wanted to stick with it.
Hmm. I thought that I laid it out very cleanly here.
And that’s what I mean: on top of the already contorted position I attributed to you, you’re adding this moral-or-maybe-something-else wrongness, which has no precedent in your earlier justifications. Do you think it’s probably one of the non-moral-wrongness things? Is that just a matter of terminology?
I think that it’s probably moral wrongness, but I’m less certain, so I’m more cautious about attributing that view to her.
But, at any rate, I honestly don’t see the contortions to which you refer. Perhaps she would experience a certain increase in pleasure if she modified herself not to dislike you. If she has this power, but chooses not to use it, then you may conclude that she cares about something more than that pleasure. It’s sort of like how Gandhi wouldn’t take a pill to make himself like to kill people, even if he knew that he would have lots of opportunities to kill people at no cost. There is a very standard distinction between what you think you ought to do and what you think will give you the most pleasure. I would expect the inferential distance on LW for this to be very short. That is why I don’t see my position as contorted.
Give me just a little credit here: yes, I do understand the difference between “this increases my pleasure” and “I should do this”; and yes, there should be low inferential distance on explaining such a point on LW. That wasn’t in dispute. What’s in dispute is how much contortion you have to go through to justify why that distinction would be relevant and applicable here (something which even the contortion leaves out).
And you didn’t lay it out very cleanly in the linked comment: you just made one distinction that is a very small part of what you have to say to specify your position.
My view is that Alicorn probably perceives certain benefits from not disliking you, such as the ones you’ve enumerated. But evidently she also sees other costs from not disliking you (costs which are probably moral). In her estimation (which I think is incorrect) the costs outweigh the benefits. Therefore, she has chosen not to apply the advice in the OP.
What’s contorted about that? As I see it, I’m just taking her revealed preferences at face value, while giving her the benefit of the doubt that she has the powers described in the OP.