If we needed correct internal computations for qualia (and not just correct behavior), that would mean the overall system would falsely believe itself to have a quale (like being in pain): it would act, in all ways, like it was in pain, but actually it wouldn’t be in pain.
To all appearances LLMs already do that and have for several years now. So, yes, that is clearly possible for a non-conscious thing to do.
Your definition of qualia is nonstandard, and defines it out of meaningfulness. More standard definitions generally include at least one synonym for ‘ineffable’ and I believe them to be entirely mysterious answers to mysterious questions.
To all appearances LLMs already do that and have for several years now.
LLMs can be (incorrectly) argued to have no qualia, and therefore no beliefs in the sense that my hypothetical uses. (In my hypothetical, the rest of the agent remains intact, and qualia-believes himself to have the quale of pain, even though he doesn’t.)
(I’m also noting you said nothing about my three other reasons, which is completely understandable, yet something I think you should think about.)
Your definition of qualia is nonstandard, and defines it out of meaningfulness. More standard definitions generally include at least one synonym for ‘ineffable’ and I believe them to be entirely mysterious answers to mysterious questions.
Do you mean meaninglessness?