it’s having the capacity to model oneself and one’s perceptions of the world.
That’s not the definition that seems to be used in many of the discussions about consciousness. For instance, the term “Hard Problem of Consciousness” isn’t talking about self-modeling.
Let’s take the discussion about p-zombies as an example. P-zombies are physically identical to normal humans, so they (that is, their brains) clearly model themselves and their own perceptions of the world. The claim that they are unconscious therefore directly contradicts the self-modeling definition of consciousness.
If proving that p-zombies are logically impossible were as simple as pointing this out, the whole debate wouldn’t exist.
Beyond that example, I’ve gone through all LW posts that have “conscious” in their title:
The Conscious Sorites Paradox, part of Eliezer’s series on quantum physics. He says:
I’m saying that “But you haven’t explained consciousness!” doesn’t reasonably seem like the responsibility of physicists, or an objection to a theory of fundamental physics.
And then he says:
however consciousness turns out to work, getting infected with virus X97 eventually causes your experience of dripping green slime.
I read that as using ‘consciousness’ to mean experience in the sense of subjective qualia.
Framing Consciousness. cousin_it has retracted the post, but apparently not for reasons relevant to us here. It talks about “conscious/subjective experiences” and asks whether consciousness can be implemented on a Turing machine. Again, it’s clear that a system that recursively models itself can be implemented on a TM, so that can’t be what’s being discussed.
MWI, weird quantum experiments and future-directed continuity of conscious experience. Clearly uses “consciousness” to mean “subjective experience”.
Consciousness. Ditto.
Outline of a lower bound for consciousness. I don’t understand this post at first sight; I would have to read it more thoroughly.
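To make the TM-implementability point concrete, here is a toy sketch (all names are hypothetical, invented for illustration): a program that maintains both a model of the world and a model of itself, including a record of its own perceiving. Everything here is ordinary computation, which is exactly why self-modeling alone can’t be what the Hard Problem is about.

```python
# Hypothetical toy sketch: a system that models itself and its own
# perceptions of the world. Trivially implementable on a Turing machine.

class SelfModelingAgent:
    def __init__(self):
        self.world_model = {}   # the agent's beliefs about the world
        self.self_model = {}    # the agent's beliefs about itself

    def perceive(self, observation):
        # Update the world model from a perception...
        self.world_model.update(observation)
        # ...and record, in the self-model, the fact of having perceived it.
        self.self_model["last_perception"] = observation
        self.self_model["beliefs_held"] = len(self.world_model)

    def introspect(self):
        # The agent can inspect its own model of itself.
        return dict(self.self_model)

agent = SelfModelingAgent()
agent.perceive({"wasp_sting": "painful"})
print(agent.introspect())
```

Nothing in this loop of modeling-the-modeler requires anything beyond ordinary state and recursion, so whether such a program has subjective experience is left entirely open.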
If p-zombies are impossible, which they are, there are no “pure subjective” experiences: any entity’s subjective experience corresponds to some objective feature of its brain or programming.
The reason “subjective experience” is called subjective is that it’s presumed not to be part of the objective, material world. That definition is dated now, of course.
I don’t want to turn this thread into a discussion of what consciousness is, or what subjective experience is. That’s a discussion I’d be very interested in, but it should be separate. My original question was: what do people mean by “consciousness”? If I understood you correctly and to you it simply means self-modeling systems, then I was right to think that different people use the C-word to mean quite different things, even just here on LW.
Let’s say you’re having a subjective experience. Say, being stung by a wasp. How do you know? Right. You have to be aware of yourself, and your skin, and have pain receptors, and blah blah blah.
But if you couldn’t feel the pain, let’s say because you were numb, you would still feel conscious. And if you were infected with a virus that made a wasp sting feel sugary and purple, rather than itchy and painful, you would also still be conscious.
It’s only when you don’t have a model of yourself that consciousness becomes impossible.
It’s only when you don’t have a model of yourself that consciousness becomes impossible.
That doesn’t mean they’re the same thing. Unless you define them to mean the same thing. But as I described above, not everyone does that. There is no “Hard Problem of Modeling Yourself”.