By ‘qualia’, people mean the basic elements of conscious experience.
Examples are the feeling of pain and the perception of redness. If you know what it feels like to be in pain, or what red looks like, that is what is meant by qualia.
Given what people mean by qualia and consciousness, you can either believe in both or disbelieve in both; but if you believe in subjective experience while disbelieving in qualia, you’re using words differently from everybody else.
(The Sequences explain that a microphysical duplicate of our body would also contain the causes of our talking about qualia, which means it would also contain our qualia, making zombies metaphysically impossible.)
> and then never explain why you believe this to be true or defend it in any way, even when challenged
I don’t think I was challenged to explain why I believe that (even though I was challenged about other things). Some reasons I believe it are in this comment.
One other reason would be that we can imagine replacing an entire part of the brain with an I/O-equivalent but computationally non-isomorphic system. If qualia required the correct internal computations (and not just the correct behavior), the overall system would then falsely believe itself to have a quale (such as being in pain): it would act, in every way, as if it were in pain, but it actually wouldn’t be.
> If qualia required the correct internal computations (and not just the correct behavior), the overall system would then falsely believe itself to have a quale (such as being in pain): it would act, in every way, as if it were in pain, but it actually wouldn’t be.
To all appearances, LLMs already do that, and have for several years now. So, yes, that is clearly possible for a non-conscious thing to do.
Your definition of qualia is nonstandard, and defines it out of meaningfulness. More standard definitions generally include at least one synonym for ‘ineffable’, and I believe them to be entirely mysterious answers to mysterious questions.
> To all appearances, LLMs already do that, and have for several years now.
LLMs can be (incorrectly) argued to have no qualia, and therefore no beliefs in the sense that my hypothetical uses. (In my hypothetical, the rest of the agent remains intact and qualia-believes himself to have the quale of pain, even though he doesn’t.)
(I also note that you said nothing about my three other reasons; that is completely understandable, yet something I think you should think about.)
> defines it out of meaningfulness

Do you mean ‘meaninglessness’?