Defining consciousness as a system with an “internal observer” is problematic, first because it invites an infinite regress: if an observer is something with subjective experience, then that observer would itself require an internal observer, and so on. I think the entire premise is too dualist; it presupposes subjectivity as a prerequisite for itself.
Neuroscience also suggests that no structure analogous to an internal observer exists, and it remains a mystery how we experience a unified stream of consciousness (the binding problem).
Your definition of self-awareness isn’t very detailed, but by any definition I can think of, there is no reason to believe animals (especially mammals) are less self-aware than we are. I think people searching “when will computers become self aware” are really asking when computers will share our common experience of being aware of our own awareness. There is no way to test for this; we can’t even test it in other people, we just believe they are like us.
On sentience vs. consciousness: the view that consciousness can exist without sentience is common. However, when people discuss “the hard problem,” philosophical zombies, or self-awareness, it’s clear they specifically mean the ability to experience phenomena. It is conceivable for a conscious being to have no preferences and to feel neither pleasure nor pain, but a being that does not experience the qualities of phenomena would not be considered conscious; it would be a zombie.
I think Freud makes a very convincing point in “The Unconscious” when discussing the problem of unconscious emotions: that feeling is the very essence of consciousness. [He was arguing that all feelings are conscious by definition, but I think it also makes a good case that everything conscious feels.] I’d argue that sentience is not the same as affect, but it is a prerequisite for it. Emotions are the attribution of feelings (like heat, sharpness, vibration), together with preferences (relative to our self-representation), to the bodily sensations caused by the hormonal and motor changes induced by any intention to move or relax. The qualities involved in emotion are just qualities like color, light, and touch, attributed to the preferences of our self-model. Feeling seems to be a prerequisite for anything we would call conscious, but emotions and preferences require a self-model [or, really, a signifier for the self that other signs can relate to].
I’m not sure where I was going with this, but let me know what you think of the idea that feeling is required for consciousness.