It’s not clear to me that a species which showed strong behavioural evidence of consciousness and valenced experience should have its welfare strongly discounted using neuron count.
You can’t have “strong behavioral evidence of consciousness”. At least in my theory of mind it is clear that you need to understand what is going on inside of a mind to get strong evidence.
Like, modern video game characters (without any use of AI) would also check a huge number of these “behavioral evidence” checkboxes, and really very obviously aren’t conscious or moral patients of non-negligible weight.
You also have subdivision issues. Like, by this logic you end up thinking that a swarm of fish is less morally relevant than the individual fish that compose it.
Behavioral evidence is just very weak, and the specific checkbox approach that RP took also doesn’t seem to me like it makes much sense even if you want to go down the behavioral route (in particular, in my opinion you really want to gather behavioral evidence to evaluate how much stuff there is going on in the brain of whatever you are looking at, like whether you have complicated social models and long-term goals and other things).
At least in my theory of mind it is clear that you need to understand what is going on inside of a mind to get strong evidence.
in particular, in my opinion you really want to gather behavioral evidence to evaluate how much stuff there is going on in the brain of whatever you are looking at, like whether you have complicated social models and long-term goals and other things
I agree strongly with both of the above points—we should be supplementing the behavioural picture by examining which functional brain regions are involved and whether these functional brain regions bear similarities with regions we know to be associated with consciousness in humans (e.g. the pallium in birds bears functional similarity with the human cortex).
Your original comment calls out that neuron counts are “not great” as a proxy, but I think a more suitable proxy would be something like functional similarity + behavioural evidence.
we know to be associated with consciousness in humans
To be clear, my opinion is that we have no idea what “areas of the brain are associated with consciousness” and the whole area of research that claims otherwise is bunk.
(Also edited the original comment with citations)