In this case, you are implicitly claiming that the difference between men and women is large enough that it makes sense to draw an analogy to the difference between humans and AIs, even though you explicitly stated that the difference is not as large.
No I’m not! Men and women are the same on any “human to AI” dimension! The analogy doesn’t rest on differences between men and women, except that there’s a desire to align in that direction, as described, coming from different incentives. I’m not making the claim you say I’m making! It’s other people’s fault if they invent an interpretation I never stated and then ding me for saying it. The only analogy is that it’s a general intelligence trying to align another general intelligence.
I don’t actually see what using women and men here adds to the analogy
It’s an especially strong case of an incentive to suss out another person’s intentions, and the strongest one I could think of. What are some other very strong cases?
in a way that I think is meaningfully wrong
Why do you think it’s meaningfully wrong? Do you mean incorrect, or morally wrong?