I think the key point is that situations where A thinks that B thinks that C thinks that X don’t require both A and B to be purely guessing. Instead, C might have told B they think X, and A might have observed that conversation. It feels to me that when people point to the difficulty of high levels of meta, they mean the difficulty of guessing through multiple levels. The examples in this post involve just observing what’s going on, which shows how easy it often is to form such states of knowledge in practice, if you don’t have to guess.
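The observation-instead-of-guessing point can be made concrete with a toy sketch (all names here, `Agent`, `tell`, `observers`, are hypothetical illustration, not any existing library): when C tells B directly and A watches, each level of the nested state is established by a single observed event, with no inference chain at all.

```python
# Toy model: nested epistemic states formed by observation, not guessing.
# Everything here is an illustrative assumption, not a standard formalism.

class Agent:
    def __init__(self, name):
        self.name = name
        self.knows = set()  # propositions this agent knows outright

    def tell(self, listener, proposition, observers=()):
        # The speaker states their view; the listener now knows it directly.
        fact = f"{self.name} thinks {proposition}"
        listener.knows.add(fact)
        # Anyone observing the conversation gains one more level of nesting,
        # again by direct observation rather than by guessing.
        for observer in observers:
            observer.knows.add(f"{listener.name} knows that {fact}")

a, b, c = Agent("A"), Agent("B"), Agent("C")
c.tell(b, "X", observers=[a])

print(b.knows)  # B knows "C thinks X"
print(a.knows)  # A knows "B knows that C thinks X"
```

The point of the sketch is that the three-level state "A thinks that B thinks that C thinks that X" is assembled from two observations, not from A simulating B simulating C.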
I don’t particularly agree about the relevance of the chunking words: the examples are only specific models of their meaning, and the meaning itself can’t be unrolled into a deep explicit chain of meta. It’s more a forest of possibilities than a chain, which is unlike the usual setting of “many levels of meta”. It does turn into more of a chain when there’s enough data, but then the words themselves become only a small part of the relevant state of knowledge.