“Sentient” is used to mean “some aspect of consciousness which gives its possessor some level of moral patienthood”, without specifying which aspect of consciousness or what kind of moral patienthood, or how they are related. So it’s a technical-looking term, which straddles two poorly understood areas, and has no precise meaning. It’s generally misleading and better tabooed.
It can’t mean that in the OP, as this definition has moral value built in, making the claim “all sentient lives matter” a tautology.
Some people use it that way. But if sentience just is moral patienthood, how do you detect it?
That is the big question. What has moral standing, and why?
I don’t think ‘tautology’ fits. There are some people who would draw the line somewhere else even if they were convinced of sentience. Some people might be convinced that only humans should be included, or maybe biological beings, or some other category of entities that is not fully defined by mental properties. I guess ‘moral patient’ is kind of equivalent to ‘sentient’, but I think this mostly tells us something about philosophers agreeing that sentience is the proper marker for moral relevance.
I agree with your logic. I’d expand the logic in the parent post to say “whatever you care about in humans, it’s likely that animals and some AIs will have it too”. Sentience is used in several ways, and poorly defined, so doesn’t do much work on its own.