I have made roughly this argument for relative moral weight, but I’m not comfortable with it.
I entirely agree that the subjective “volume” of pain is more likely tuned by evolution. (edit:) But the functional effectiveness of the pain signal doesn’t seem to be what we care about or assign moral worth to; what matters is the degree of suffering, which must be based on some property of the information processing in the brain, and is therefore likely related to brain complexity.
For me, neuron count is a very rough approximation, based on the reasoning that any reasonable way of defining moral worth must at least lie on a continuum. It seems very strange to suppose that moral worth (or the type of consciousness that confers it) suddenly appears when a critical threshold is passed, and is entirely absent just below that threshold. One bear, beetle, or bacterium would have had no consciousness or moral worth, and then suddenly its offspring has them in full while being nearly indistinguishable in behavior.
I’ve had the opportunity to think about the neural substrates of consciousness in a fair amount of depth. I still don’t have a good definition of whom we should assign moral worth to (and I think it’s ultimately a matter of preference). But to even approach being a sensible and internally consistent position, it seems like it’s got to be a continuous value. Neuron count is as close as I can get, since it’s a very rough proxy for the richness of information processing in a system along every dimension. So whichever dimension(s) we settle on, neuron count will at least be in the wild ballpark.
A better final answer would count only the neurons and synapses contributing to whatever-it-is, probably weighted by some nonlinear function, and would go into much more depth than that. But neuron count is the best starting point I can think of.
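To make that concrete, here is a minimal sketch (in Python) of what a nonlinear neuron-count weighting could look like. The power-law form, the exponent, and the per-species neuron counts are all illustrative assumptions on my part, not anything established; the point is only that the weighting is continuous and tunable rather than a hard threshold.

```python
# Toy sketch: moral weight as a nonlinear (power-law) function of neuron count,
# normalized so a human counts as 1.0. Exponent and species numbers below are
# illustrative assumptions, not established values.

HUMAN_NEURONS = 86e9  # commonly cited estimate of ~86 billion neurons

# Rough, order-of-magnitude whole-brain neuron counts (illustrative)
ANIMALS = {
    "human": 86e9,
    "dog": 2.3e9,
    "chicken": 2.2e8,
    "honeybee": 1e6,
}

def moral_weight(neurons: float, exponent: float = 0.5) -> float:
    """Return (neurons / HUMAN_NEURONS) ** exponent.

    exponent = 1.0 makes weight proportional to neuron count;
    exponent < 1.0 compresses the range, giving smaller brains relatively
    more weight. Choosing the exponent is exactly the kind of unresolved
    judgment call discussed above.
    """
    return (neurons / HUMAN_NEURONS) ** exponent

if __name__ == "__main__":
    for name, n in ANIMALS.items():
        print(f"{name:10s} {moral_weight(n):.4f}")
```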