I think you are a bit off the mark.
As a reductive materialist, expecting to find a materialistic explanation for consciousness, in your model I’d be Camp 2. And yet, in the dialogue:
“It’s obvious that consciousness exists.”
-Yes, it sure looks like the brain is doing a lot of non-parallel processing that involves several spatially distributed brain areas at once, so-
“I’m not just talking about the computational process. I mean qualia obviously exists.”
-Define qualia.
“You can’t define qualia; it’s a primitive. But you know what I mean.”
-I don’t. How could I if you can’t define it?
“I mean that there clearly is some non-material experience stuff!”
-Non-material, as in defying the laws of physics? In that case, I do get it, and I super don’t-
“It’s perfectly compatible with the laws of physics.”
-Then I don’t know what you mean.
“I mean that there’s clearly some experiential stuff accompanying the physical process.”
-I don’t know what that means.
I relate much more to the position expressed via bold text (the replies prefixed with a dash), because the Camp 2 person here smuggles in the assumption that qualia are non-material and that experience-stuff is separate from the physical process. Some definitions of consciousness/experience/qualia are incoherent, and we may not even know it yet because we do not know all the related physics. Applying the Socratic method here and refusing to accept vague handwaving aka “you know what I mean” is a valid strategy for getting rid of the confusion around the topic.
People are confused about consciousness due to their initial intuitions, and then they find full-fledged philosophies built around this confusion, created long before we made substantial progress in neuroscience. Thus people get validation of their confusion and may form their identities around it instead of trying to resolve it. And for these historical reasons we have completely different narratives around consciousness, not unlike political ones, between those who try to resolve their confusion and those who don’t.
According to Camp #1, the correct explanandum is still “I claim to have experienced X”
I wonder: can we mend the rift by introducing a bit of recursion? Consider:
I experience to have experienced X.
Now I can make statements like:
My experience of experience is different from my experience of matter
Without smuggling in the assumption that experience is indeed different from matter. It’s curious how many philosophical problems can be deconfused by a proper understanding of the map-territory framework.
Thanks for that comment. Can you explain why you think you’re Camp #2 according to the post? Because based on this reply, you seem firmly (in fact, quite obviously) in Camp #1 to me, so there must be some part of the post where I communicated very poorly.
(Guessing at the reason here:) I wrote in the second-to-last section that consciousness, according to Camp #1, has fuzzy boundaries. But that just means that the definition of the phenomenon has fuzzy boundaries, meaning that it’s unclear when consciousness would stop being consciousness if you changed the architecture slightly (or built an AI with similar architecture). I definitely didn’t mean to say that there’s fuzziness in how the human brain produces consciousness; I think Camp #1 would overwhelmingly hold that we can, in principle, find a full explanation that precisely maps out the role of every last neuron.
Was that section the problem, or something else?
At first I also thought that I was a central example of Camp 1 based on the general vibes, but then I reread the descriptions. I’ve bolded the things that I agree with in both of them:
Camp #1 tends to think of consciousness as a non-special high-level phenomenon. Solving consciousness is then tantamount to solving the Meta-Problem of consciousness, which is to explain why we think/claim to have consciousness. In other words, once we’ve explained why people keep uttering the sounds kon-shush-nuhs, we’ve explained all the hard observable facts, and the idea that there’s anything else seems dangerously speculative/unscientific. No complicated metaphysics is required for this approach.
Conversely, Camp #2 is convinced that there is an experience thing that exists in a fundamental way. There’s no agreement on what this thing is – theories range anywhere from hardcore physicalist accounts to substance dualists that postulate causally active non-material stuff – but they all agree that there is something that needs explaining. Also, getting your metaphysics right is probably a part of making progress.
I do not think that explaining why people talk about consciousness is the same as explaining what consciousness is. People talk about “consciousness” because they possess some mental property that they call “consciousness”. What exactly this is is still an open problem. I expect to find something like a specific encoding that my brain uses to translate signals from my body into the interface that the central planning agent interacts with. And while I agree that no complicated metaphysics is required, discarding metaphysics still counts as getting it exactly right. I do not think that consciousness is fundamental, but since you’ve included hardcore physicalist accounts in Camp 2, I’m definitely Camp 2.
Okay, that makes a lot of sense. I’m still pretty sure that you’re a central example of what I meant by Camp #1, and that the problem was how I described them. In particular,
Solving consciousness = solving the Meta Problem: what I meant by “solving the meta problem” here entails explaining the full causal chain. So if you say “People talk about ‘consciousness’ because they possess some mental property that they call ‘consciousness’”, then this doesn’t count as a solution until you also recursively unpack what this mental property is, until you’ve reduced it to the brain’s physical implementation. So I think you agree with this claim as it was intended. The way someone might disagree is if they hold something like epiphenomenalism, where the laws of physics are not enough and additional information is required. Or, if they are physicalists, they might still hold that additional conceptual/philosophical/metaphysical work is required on our part.
hardcore physicalist accounts: I think virtually everyone in Camp #1 is a physicalist, whereas Camp #2 is split. So this doesn’t put you in Camp #2.
getting your metaphysics right: well, this formulation was dumb since, as you say, not bringing strange metaphysics into the picture is also one way of getting it right. What I meant was that the metaphysics is nontrivial.
I’ve just rewritten the descriptions of the two camps. Ideally, you should now fully identify with the first. (Edit: I also rewrote the part about consciousness being fuzzy, since I think that was poorly phrased even if it didn’t cause issues here.)
Okay, now Camp 1 feels more like home. Yet, I notice that I’m confused. How can anyone in Camp 2 be a physicalist then? Can you give me an example?
So if you say “People talk about ‘consciousness’ because they possess some mental property that they call ‘consciousness’”, then this doesn’t count as a solution until you also recursively unpack what this mental property is, until you’ve reduced it to the brain’s physical implementation.
Sounds about right. But just to be clear, it doesn’t mean that “consciousness” equals “talk about consciousness”. It’s just that by explaining the bigger thing (consciousness) we will also explain the smaller one (talk about consciousness) that depends on it. I expect consciousness to be related to many other things, with talk about it being just an obvious example of something that wouldn’t happen without consciousness.
I was under the impression that your camps were mostly about whether a person thinks there is a Hard Problem of Consciousness or not. But now it seems that they are more about whether the person includes some form of idealism in their worldview? I suppose you are trying to compress both these dimensions (idealism/non-idealism, HP/non-HP) into one. And if so, I’m afraid your model is going to miss a lot of nuance.
Sounds about right. But just to be clear, it doesn’t mean that “consciousness” equals “talk about consciousness”. It’s just that by explaining the bigger thing (consciousness) we will also explain the smaller one (talk about consciousness) that depends on it. I expect consciousness to be related to many other things, with talk about it being just an obvious example of something that wouldn’t happen without consciousness.
Yes, this is also how I meant it. Never meant to suggest that the consciousness phenomenon doesn’t have other functional roles.
Okay, now Camp 1 feels more like home. Yet, I notice that I’m confused. How can anyone in Camp 2 be a physicalist then? Can you give me an example?
So first off, using the word physicalist in the post was very stupid since people don’t agree on what it means, and the rewrite I made before my previous comment took the term out. So what I meant, and what the text now says without the term, is “not postulating causal power in addition to the laws of physics”.
With that definition, lots of Camp #2 people are physicalists—and on LW in particular, I’d guess it’s well over 80%. Even David Chalmers is an example; consciousness doesn’t violate the laws of physics under his model, it’s just that you need additional—but non-causally-relevant—laws to determine how consciousness emerges from matter. In general, you can also just hold that consciousness is a different way to look at the same process, which is sometimes called dual-aspect monism, and that’s physicalist, too.
I was under the impression that your camps were mostly about whether a person thinks there is a Hard Problem of Consciousness or not. But now it seems that they are more about whether the person includes some form of idealism in their worldview? I suppose you are trying to compress both these dimensions (idealism/non-idealism, HP/non-HP) into one. And if so, I’m afraid your model is going to miss a lot of nuance.
I mean, I don’t think it’s just about the hard problem; otherwise, the post wouldn’t be necessary. And I don’t think you can say it’s about idealism because people don’t agree on what idealism means. Like, the post is about describing what the camps are; I don’t think I can do it better here, and I don’t think there’s a shorter description that will get everyone on board.
In general, another reason why it’s hard to talk about consciousness (which was in a previous version of this post but I cut it) is that there’s so much variance in how people think about the problem, and what they think terms mean. Way back, gwern said about LLMs that “Sampling can prove the presence of knowledge but not the absence”. The same thing is true about the clarity of concepts; discussion can prove that they’re ambiguous, but never that they’re clear. So you may talk to someone, or even to a bunch of people, and you’ll communicate perfectly, and you may think “hooray, I have a clear vocabulary, communication is easy!”. And then you talk to someone else the next day and you’re talking way past each other. And it’s especially problematic if you pre-select people who already agree with you.
Overall, I suspect the Camp #1/Camp #2 thing is the best (as in, the most consistently applicable and most informative) axis you’ll find. Which is ultimately an empirical question, and you could do polls to figure it out. I suspect asking about the hard problem is probably pretty good (but significantly worse than the camps) and asking about idealism is probably a disaster. I also think the camps get at a more deeply rooted intuition compared to the other stuff.
So what I meant, and what the text now says without the term, is “not postulating causal power in addition to the laws of physics”.
Oh I see. Yeah, that’s an unconventional use of “physicalism”; I don’t think I’ve ever seen it before.
Using the conventional philosophical language, or at least the one supported by Wikipedia and search engines, Camp 1 maps pretty well to monist materialism aka physicalism, while Camp 2 is everything else: all kinds of metaphysical pluralism, dualism, idealism and more exotic types of monism.
Anyway, then indeed, Camp 1 all the way for me. While I’m still a bit worried that people using such broad definitions will miss important nuance, it’s a very good first approximation.