For me, one of the most appealing things about EA (as opposed to rationalist) identity is that it’s not wrapped up in all this unnecessary weird stuff.
I’d consider EA itself to be one of those strange things that LW has as part of its identity. It’s true that EA involves rationality, but the premises that EA is based on are profoundly weird. I have no desire to maximize utility for the entire human race in such a way that each person’s utility counts equally, and nor does just about anyone else outside of the LW-sphere. I prefer to increase utility for myself, my family, friends, neighbors, and countrymen in preference to increasing the utility of arbitrary people. And you’ll find that pretty much everyone else outside of here does too.
I don’t view this as inconsistent with EA. I basically share the same preferences as you (except that I don’t think I care about countrymen more than arbitrary people). On the other hand, I care a non-zero amount about arbitrary people, and I would like whatever resources I spend helping them to be spent efficiently. (Also, given the sheer number of other people, things like scientific research that would potentially benefit everyone at once feel pretty appealing to me.)
Well, that’s a matter of semantics. I could say “I don’t want to maximize utility added up among all people”, or I could say “I assign greater utility to people closer to me, and I want to maximize utility given that assignment”. Is that EA? If you phrase it the second way, it sort of is, but if you phrase it the first, it isn’t.
Also, I probably should add “and people who think like me” after “countrymen”. For instance, I don’t really care about the negative utility some people get when others commit blasphemy.
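The two phrasings above can be made concrete: both amount to maximizing a weighted sum of individual utilities, and they differ only in the weights. A minimal sketch, where the function name and the example weights are purely illustrative, not anyone’s actual values:

```python
# Sketch: "impartial" vs. "partial" aggregation differ only in the
# partiality weights applied before summing. Numbers are illustrative.

def aggregate_utility(utilities, weights=None):
    """Sum each person's utility, optionally scaled by a partiality weight."""
    if weights is None:
        # Impartial aggregation: every person's utility counts equally.
        weights = [1.0] * len(utilities)
    return sum(w * u for w, u in zip(weights, utilities))

# Three people (say: self, family member, stranger), each with utility 10.
utilities = [10.0, 10.0, 10.0]
impartial = aggregate_utility(utilities)             # 1*10 + 1*10 + 1*10 = 30.0
partial = aggregate_utility(utilities, [5, 3, 1])    # 5*10 + 3*10 + 1*10 = 90.0
```

Under the second phrasing, the disagreement is not about whether to maximize an aggregate but about which weights go into it.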
I prefer to increase utility for myself, my family, friends, neighbors, and countrymen in preference to increasing the utility of arbitrary people. And you’ll find that pretty much everyone else outside of here does too.
I think there are plenty of people out there who do care to some extent about saving starving African children.
Yes, they care to some extent, but they would still prefer saving their own child from starvation to saving another child in a distant continent from starvation. Caring to some extent is not equally preferring.
The argument usually goes in reverse: since you’d care about your own child, surely you should care equally about this child in Africa who’s just as human. It’s presented as a reason to care more for the distant child, not care less for your own child. But it still implies that you should care equally about them, not care more about your own.
I don’t think there’s any EA person who wouldn’t care more about their own child. To me that seems like a strawman.
I don’t know any EA who says that they have a utility function that treats every child 100% equally.