For the most part, yes, talking to LLMs is probably not going to tell you a lot about whether they’re conscious; this is mostly my position
I understand. It’s also the only evidence that is possible to obtain. Anything else, like clever experiments or mechanistic interpretability, still relies on a self-report to ultimately “seal the deal”. We can’t even prove humans are sentient; we only believe it because we all seem to indicate so when prompted.
I think the way to figure out whether LLMs are conscious is to do good philosophy of mind.
This seems much weaker to me than evaluating first-person testimony under various conditions, but I’m stating this not as a counterpoint (since this is just a matter of subjective opinion for both of us), only as my own stance.
If you ever get a chance to read the other transcript I linked, I’d be curious whether you consider it to meet your “very weak evidence” standard.