Here’s how I see this issue (from a philosophical point of view):
Moral value is, in its most general form, a function of the state of a structure, for lack of a better word. The structure may be just 10 neurons in isolation, whose moral worth may well be exactly zero; or 7 billion blobs of roughly 10^11 neurons each, communicating with one another; or a lot of data on a hard drive representing a stored upload.
The moral value of two interconnected structures does not, in general, equal the sum of the moral values of each structure (examples: a whole brain vs. a piece of a brain, a mind running on redundant hardware). The moral value of the whole can, in general, be greater or less than the sum of the moral values of the parts. Note that I have not defined anything specific here; I have only stated very general considerations. What we actually have are somewhat ad hoc approximations to some sort of ideal moral worth.
edit: Note that the moral worth of an action is, in general, a function of the state without the action and the state with the action, not necessarily the difference between the moral worth of one state and the moral worth of the other.
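To make the non-additivity point concrete, here is a toy sketch (the rules and the numbers are mine, purely illustrative): if aggregate worth is computed over distinct mind-states rather than over physical instances, exact redundant copies do not count twice, so the whole is worth less than the sum of its parts.

```python
# Toy sketch with assumed numbers: aggregate moral weight over distinct
# mind-states, so exact duplicates (a mind on redundant hardware) don't count twice.

def additive_worth(minds):
    """The additive rule: W(a .. b) = W(a) + W(b), counting every physical instance."""
    return sum(w for _, w in minds)

def distinct_state_worth(minds):
    """A non-additive rule: identical copies of the same mind count only once."""
    return sum(dict(minds).values())

# "alice" runs on doubly redundant hardware; "bob" on a single machine.
minds = [("alice", 1.0), ("alice", 1.0), ("bob", 1.0)]
print(additive_worth(minds))        # 3.0
print(distinct_state_worth(minds))  # 2.0 -- the whole is less than the sum of the parts
```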
The utilitarianism of the “N dust specks are worse than torture” variety takes as fundamental, and as the ideal, a number of assumptions, such as that moral worth is additive, i.e. W(a .. b) = W(a) + W(b). We have clear counter-examples to this when the parts are strongly interconnected (e.g. the two hemispheres of a brain) or correlated (doubly redundant hardware), though it may hold approximately for people, since they are not strongly interconnected. With very large N, a clearly broken premise is taken to its extreme and then proclaimed normative, while the approximations that aren’t linear are proclaimed wrong.
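A similarly toy sketch of the dust-specks arithmetic (all numbers are mine and purely illustrative): under the additive premise, any nonzero per-person disvalue overtakes the torture for a large enough N, whereas a non-linear aggregation (here, a saturating one, just as an example) never does, no matter how large N gets.

```python
import math

SPECK = 1e-9     # assumed disvalue of one dust speck for one person
TORTURE = 1e6    # assumed disvalue of 50 years of torture

def additive(n):
    """The additivity premise: total disvalue is simply N * w(speck)."""
    return n * SPECK

def saturating(n, cap=1.0):
    """One possible non-linear alternative: speck disvalue saturates at a cap."""
    return cap * (1.0 - math.exp(-n * SPECK / cap))

N = 10**19  # a very large N
print(additive(N) > TORTURE)    # True: the specks outweigh the torture under additivity
print(saturating(N) > TORTURE)  # False: they never do under the saturating rule
```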