This makes me wonder: how could an AI figure out whether it had conscious experience? I always used to assume that, from a first-person perspective, it's obvious when you're conscious. But that is circular reasoning, since it presupposes you have a "perspective" and are able to ponder the question in the first place. So what does a, say, reasoning model do? If there is consciousness, how will it ever know? Does it have to solve the "easy" problem of consciousness first and then apply the answer to itself?