To me the core of neuron counting as an intuition is that all living beings seem to have a depth to their reactions that scales with the size of their mind. There’s a richness to a human mind in its reactions to the world which other animals don’t have, just as dogs have a deeper interaction with everything than insects do.
This correlates pretty strongly with why and when we feel care for creatures: how much we ‘recognize’ their depth. It is also why people are so often intrigued to learn that certain animals have more depth than we might intuitively think.
As for whether there is an article, I don’t know of any that I like, but I’ll lay out some thoughts. This will be somewhat rambly, in part to try to give some stronger reasons, but also to gesture at related ideas that aren’t usually spelled out.
One important consideration I often have to keep in mind in these sorts of discussions is that when we evaluate moral worth, we do not just care about instantaneous pleasure/pain, but rather about an intricate weighting of hundreds of different considerations. This may well mean that we care about weighting by richness of mind, even if we determine that some scale would say two beings experience the ~same level of pain.
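To make the shape of that claim concrete, here is a minimal toy sketch of moral weight as a weighted mix of considerations rather than a single pain reading. Every factor name and weight here is invented purely for illustration, not a proposal for actual values:

```python
# Toy sketch: moral weight as a weighted mix of considerations,
# not a single pain reading. All factors and weights are invented
# for illustration only.
def moral_weight(pain_intensity: float, richness_of_mind: float,
                 other_factors: dict[str, float]) -> float:
    weights = {"pain": 1.0, "richness": 0.5, **{k: 0.1 for k in other_factors}}
    score = weights["pain"] * pain_intensity
    score += weights["richness"] * richness_of_mind
    score += sum(weights[k] * v for k, v in other_factors.items())
    return score

# Two beings with the ~same pain can still diverge once richness is weighted in:
print(moral_weight(1.0, richness_of_mind=0.1, other_factors={}))  # simple mind -> 1.05
print(moral_weight(1.0, richness_of_mind=5.0, other_factors={}))  # rich mind   -> 3.5
```

The point is only structural: equal `pain_intensity` does not force equal moral weight once other terms enter the sum.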
Duplication: Even if we aren’t weighting by ‘richness of mind’ or some related factor, we still end up with a similar weighting factor once we stop treating the mind as one solid thing with a single core self receiving input.
If a simple system can have pain just as intense as a more complex system, then why wouldn’t the subsystems within a large brain have their own intense ‘experiences’?
I experience a twinge of discomfort when thinking of an embarrassing event from some years ago. To my ‘self’ this is a relatively minor pain, but my brain is using substantially more neurons than lie within a beetle. More subsystems fire. While the core mind registers this as a minor sensation, each small subsystem may be receiving a big update locally; it is just that the architect overseeing everything else doesn’t need to perceive the sensation as more than a small hit.
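A toy way to picture this, where the subsystem counts and update magnitudes are entirely made up for illustration:

```python
# Toy model of the Duplication point: many subsystems each take a
# large *local* update, while the 'self' only reads a small aggregate.
# All numbers are invented for illustration.
import random

random.seed(0)

def embarrassing_memory(n_subsystems: int) -> tuple[float, float]:
    # Each subsystem gets its own sizable local update...
    local_updates = [random.uniform(0.5, 1.0) for _ in range(n_subsystems)]
    # ...but the self perceives only a damped average: a "small hit".
    felt_by_self = sum(local_updates) / n_subsystems * 0.1
    return sum(local_updates), felt_by_self

total_beetle, felt_beetle = embarrassing_memory(n_subsystems=3)
total_human, felt_human = embarrassing_memory(n_subsystems=300)

# The self-level "twinge" is similar in both, but the summed local
# experience is two orders of magnitude larger in the bigger brain.
print(felt_beetle, total_beetle)
print(felt_human, total_human)
```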
A mild form of this drives the “cognitive complexity ⇒ pain is more impactful” intuition. Pain filters through multiple layers of your mind, updating them. For an insect, this is simple conditioning to avoid-on-sense. For a dog, it is similar but with added updates to local context. For humans, it can have farther-reaching consequences for how much they trust others, trust themselves, and feel safe, both locally and in general. A mouse may just get a “don’t go there” when shocked, while a human gets “don’t go there; not safe; I hate being wrong about being safe”, and so on.
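One way to render that difference as a sketch. The layer names are invented; nothing here is a claim about real neuroscience, only about the shape of the argument:

```python
# Toy sketch: the same pain event updates more layers in a richer mind.
# Layer lists are invented for illustration.
MINDS = {
    "insect": ["avoid-on-sense"],
    "dog":    ["avoid-on-sense", "local context"],
    "human":  ["avoid-on-sense", "local context", "trust in others",
               "trust in self", "general sense of safety"],
}

def updates_from_pain(mind: str) -> list[str]:
    # Every layer the pain filters through gets its own update.
    return [f"update: {layer}" for layer in MINDS[mind]]

for mind in MINDS:
    print(mind, updates_from_pain(mind))
```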
“I would not expect pleasure and pain to become more intense merely because the brain happens to have more neurons.”
To address this specifically: the richness-of-mind view provides an answer, namely that pain becomes more intense because it ties into far more.
While my eyes and the eyes of a mouse likely both provide a similar sense of “BLUE=true, LIGHT=true, SKY=true” when looking up at the sky, by the time it reaches my ‘self’ there is a massive amount more implicature and feeling embedded in that sensation. A mouse’s instincts give it a sense of openness and a wariness of predators, paired with a handful of associations learned over its life. A human has all the ingrained instincts, openness, warmth, life, safety, plus learned associations that vary with the precise tinge and cloudiness of the sky, specific times in their life, and so on.
In a way, I view collapsing all of these under one sensation like “seeing-sky” as very reductive. While both effectively have “SEEING SKY=true”, it is more that the human is experiencing dozens of different sensations while the mouse is experiencing half a dozen.
I find it very plausible that pleasure/pain is similar. We do not just get a “PAIN=true”; we get a burst of a dozen different sensations related to the pain, with different reactions to those sensations bursting out from the mind.
This sort of bundling under one term while ignoring volume is very questionable. If we take the naive application of ‘PAIN=true’, then we would consider a mind that can do lots of parallel processing as having the same degree of pain when it receives a hundred simultaneous pain signals as when it receives a single one.
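A minimal sketch of where the boolean bundling goes wrong. The sensation lists are invented for illustration; the point is only that a boolean reading erases the difference in volume that a count preserves:

```python
# Toy sketch: collapsing many simultaneous sensations into one
# boolean throws away volume. Sensation lists are invented.
mouse_pain = ["sharp", "flinch", "avoid-here"]
human_pain = ["sharp", "flinch", "avoid-here", "not safe",
              "distrust of own judgment", "worry about recurrence",
              "embarrassment", "anger"]

def naive_reading(sensations: list[str]) -> bool:
    # The 'PAIN=true' view: both minds register identically.
    return len(sensations) > 0

def volume_reading(sensations: list[str]) -> int:
    # Counting the burst of distinct sensations preserves the difference.
    return len(sensations)

print(naive_reading(mouse_pain), naive_reading(human_pain))    # True True
print(volume_reading(mouse_pain), volume_reading(human_pain))  # 3 8
```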
This is similar to, but not quite the same as, the Duplication view: Duplication is about isolated subcircuits of the brain mattering, whereas this section is about the ‘self’ actually receiving a lot more inputs, with the bundling of concepts obscuring that reality. I think a lot of this comes down to iffy ontology: human intuition is tracking some of these factors, but they haven’t been pinned down, and so they are hard for most people to talk about.