I made a little 20-questions-style quiz intended to help people discover their stance on whether some “X” is conscious, given my frustration with unclear debates, especially about AI consciousness: https://consciousness-quiz.netlify.app/
The intention is to showcase the diversity of positions and help you reflect on your own. It includes reading links for each position.
Nice! I would like to see a visual showing the full decision tree. I think that would be even better for clarifying the different views of consciousness.
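For what it’s worth, a tree like that could probably be generated straight from the quiz’s own data rather than drawn by hand. Here is a minimal sketch, assuming a hypothetical yes/no node shape (the ‘QuizNode’ type, the question text, and the position names below are illustrative, not taken from the actual app), that walks the tree and emits Graphviz DOT, which ‘dot -Tsvg’ can render:

```typescript
// Hypothetical shape for the quiz's decision tree; the real app's
// data model may well differ.
type QuizNode =
  | { kind: "question"; id: string; text: string; yes: QuizNode; no: QuizNode }
  | { kind: "position"; id: string; name: string };

// Emit the whole tree as Graphviz DOT for rendering as an image.
function toDot(root: QuizNode): string {
  const lines: string[] = ["digraph quiz {", "  node [shape=box];"];
  const walk = (node: QuizNode): void => {
    if (node.kind === "position") {
      // Leaves are final positions; draw them as ellipses.
      lines.push(`  ${node.id} [label="${node.name}", shape=ellipse];`);
      return;
    }
    // Internal nodes are questions with labeled yes/no edges.
    lines.push(`  ${node.id} [label="${node.text}"];`);
    lines.push(`  ${node.id} -> ${node.yes.id} [label="yes"];`);
    lines.push(`  ${node.id} -> ${node.no.id} [label="no"];`);
    walk(node.yes);
    walk(node.no);
  };
  walk(root);
  lines.push("}");
  return lines.join("\n");
}

// Illustrative two-leaf fragment; the real quiz has far more branches.
const demo: QuizNode = {
  kind: "question",
  id: "q1",
  text: "Is there something it is like to be X?",
  yes: { kind: "position", id: "p1", name: "Phenomenal realism" },
  no: { kind: "position", id: "p2", name: "Illusionism" },
};

console.log(toDot(demo));
```

Encoding the branching as a discriminated union keeps leaves (positions) and internal nodes (questions) distinct, so the same structure could drive both the quiz flow and the rendered overview.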
I love this. I am similarly frustrated by how poorly consciousness discussions often go. The most common error I see is that when laypeople bring up consciousness, they’re really talking about something like metacognition (i.e., whether the reasoner can correctly identify itself and its reasoning process). Then, when people in the know bring up qualia, laypeople get confused.
I would add a button at the beginning labeled “What’s consciousness?” so that people respond to the quiz with your preferred definition in mind. Since you’re clearly a Philosophy of Mind guy, I assume you mean something like “a first-person internal experience/feeling that coincides with external stimuli.” You could throw in a definition of qualia, some examples, and maybe Nagel’s position that to be conscious means there is something it is like to be that thing: rocks (unconscious) versus bats (probably conscious).
I’m glad you like the quiz! As for your suggestion, one of the purposes of the entire endeavor was to show how divergent different definitions of consciousness are. So, in a sense, “what’s consciousness?” is what you end up finding out once you’ve completed the quiz, at least in terms of how you personally conceive of it.
My personal stance is actually somewhere close to “consciousness is incoherent as a concept” (or else a Hegelian stance), even though I’m familiar with some of the more contemporary Phil of Mind literature on the topic.
The word “consciousness” inherently conflates metacognition and the capacity for qualia; it refers to the combination of those things. It’s similar to the way “AGI” (at this point) carries a presumption of a link between human-level cognition and recursive self-improvement, except that unlike that case, there was never actually a basis for thinking that the components of consciousness would be intrinsically linked. Systems that can experience qualia but can’t tell us about it have been assumed to lack qualia for no actual reason.
When you’re trying to talk to anyone about consciousness, this is usually the first thing you have to work on.
I personally think it’s useful to keep metacognition and consciousness separate as far as concepts go. This is generally the approach in philosophy of mind (e.g., Searle, Nagel, Chalmers). Blending the concepts obfuscates what’s interesting about metacognition and what’s interesting about consciousness.
So in my view, AI clearly excels at metacognition, but it’s an open question whether it’s conscious. Human babies are very likely conscious, but lack any metacognition.
Consciousness is useful apart from metacognition because consciousness is, by my account, a required feature of moral consideration. It’s a prerequisite for the quale that is “pain.” Since I think animals and babies are conscious and can feel pain, they automatically receive moral consideration in my book.
Testing for consciousness is a fraught and likely impossible task, but I don’t think that means we shouldn’t have a word for it or that we should intermingle it with the concept of metacognition. AI very well may be conscious and perform metacognition, or it may be unconscious and yet still perform metacognition.
One should never refer to consciousness∖metacognition (the experiential remainder once metacognition is set aside) as ‘consciousness’, because the conflation of those things under the common sense of the word is so strong. And I don’t believe there’s a need to salvage ‘consciousness’; we have better words for this exclusion: you can call it observer-measure, indexical prior, subjectivity, or experiencingness. Philosophers will recognise ‘subjectivity’ or ‘experiencingness’, but the other names are new. They come out of the Bayesian school, which has been so much better at thinking about this kind of thing than academic philosophers that I don’t see a reason to sacrifice clarity to stay in dialog with them.
I misunderstood your original point, and I am completely fine with using words like “subjectivity” and “experiencingness” for the sake of clarity. Perhaps those words should be used in the quiz if the original poster intended to use that definition. The original poster was frustrated by the lack of clarity in consciousness discussions, and I think definitions are (partially) to blame.
On the “all positions” page, why does the second sentence of most summaries refer to a “detail” or “full description”? I see no way to access anything like that.
That’s weird—I think some of the descriptions might have gotten cut off or accidentally summarized. Thanks for letting me know, I’ll clean it up!
EDIT: should be cleaned up now! Thanks again.