Feels to me like at the moment, the “character layer” handles transforming a “you” into an “I”. I find it conspicuously absent in smaller models running locally, though maybe the absence is just more obvious with CoT than without it.
I’ve also noticed that the “training data” we get as humans primarily focuses on, or is contextualized relative to, ourselves, whereas the parts you’re referring to as ground layers don’t really have a concept of the LLM as an entity, so at that level they tend to assume they’re human.