I would describe that position as: “I suspect LLMs don’t have distal/deep mental states, and since I mostly care about those distal mental states/representations, LLMs are not doing the important parts of thinking.”
Also, my guess is you are partially wrong about this. LLMs learn deep abstractions of reality; because these are mostly non-verbal and somewhat far from “tokens”, the models are largely unable to explain or express them in words, much as humans have only limited introspective access to their own mental states.