That said, there’s a reason we have a word for “fiction”. Physical reality is special and different from all the other worlds; it can “push back against us” in a way that other worlds can’t.
I frame this the other way around. Fiction is the special thing. Living in the world is what’s “normal”.
I say this because that’s how most minds seem to work. Dogs, as best we can tell, don’t really have imagination or the ability to consider hypotheticals. Neither do even many smart animals, or at least not consistently.
Humans are perhaps unique in our ability to consider fictional stories, and some humans struggle to make sense of hypotheticals, which means they struggle to see the distinction between fiction and reality. The ability to consider the fictional and recognize it as fictional is a special skill that sets people apart from animals, and in many cases lets some people excel at tasks others struggle with.
One more thought to add here about AI minds:
LLMs are interesting in that they are perhaps uniquely minds that are all fiction. That is, I’m not sure LLMs can actually tell fiction and reality apart (in the same way we can), because I’m not sure they’ve solved the symbol grounding problem. Instead, they’re so good at manipulating symbols, and have such a rich set of training data, that not having solved symbol grounding doesn’t really matter: they end up lining up with reality anyway.
To the extent they can tell fiction apart from non-fiction, it’s because we humans make this distinction. But I wouldn’t expect non-fiction to feel different from fiction to an LLM: it’s just another frame for deciding which tokens to generate. Internally, non-fiction probably doesn’t look much different from fiction to them, except that fictional worlds, for some reason, have much less training data and overlap heavily with the non-fiction world.
But this is maybe not that interesting. If everything is fiction to a mind (or everything is non-fiction), there’s no need to tell the two apart.
(None of this is to say that LLMs can’t tell what’s fiction when asked. I’m making a claim about whether an LLM might experience fiction as different from non-fiction the way humans do, and suggesting that they probably don’t.)
Do you think they’re actually struggling to distinguish real from fiction, or merely struggling to keep two complex distinct worlds in their working memory/stack/context window and keep the details straight?
E.g. many animals will play and chase, understanding both that there are different rules because it’s play, yet still transfer the skills to actual hunting or fighting. Seems more a matter of degree?
I think they simply lack the mental machinery to think about hypotheticals in general. So there’s no struggle really, because they have no conception of fiction.
It is a matter of degree, though. Some animals show a capacity for social deception. That’s a limited form of fictional thinking, and is probably the basis from which humans developed the skill (because chimps can also do some amount of social deception).
As for play fighting, I think this is better understood as a different behavioral mode. It doesn’t actually require conceptualization, just an ability to engage in a ritualized behavior with others that may be similar to but is safely different from real fighting. Also don’t forget that play fights sometimes accidentally become real fights!