In the board game Guess Who, the maximally informative bit to get from your opponent is one that cuts the remaining search space in half, i.e., construct a set of queries that turns the candidates into a binary tree. I think of the connotation space of words the same way. The space of meaning is super high dimensional. One of the ways to cut the space down quickly is to use lots of contrasting opposites built into words and language patterns.
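The binary-tree idea can be sketched concretely: pick the question whose yes/no split is closest to half the remaining candidates, since that question yields close to one full bit. The candidate set and attributes below are made up for illustration.

```python
# Hypothetical Guess Who board: each candidate maps to its set of attributes.
candidates = {
    "Alice": {"glasses", "hat"},
    "Bob":   {"glasses"},
    "Carol": {"glasses", "beard"},
    "Dave":  {"hat"},
    "Erin":  {"beard"},
    "Frank": set(),
}

def best_question(cands):
    """Pick the attribute whose yes/no split is closest to 50/50,
    i.e., the query that comes nearest to halving the search space."""
    attributes = set().union(*cands.values())
    def split_score(attr):
        yes = sum(attr in feats for feats in cands.values())
        return abs(yes - len(cands) / 2)  # distance from an even split
    return min(attributes, key=split_score)

print(best_question(candidates))  # → glasses (splits the six candidates 3/3)
```

Asking the winning question and discarding the losing half, then repeating, is exactly the binary-tree construction over the candidates.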
I like this framing, especially since it gracefully handles the ways that communication isn't like Guess Who: your priors don't look like "uniform over the following N possibilities", your payoff for actually finding the answer might be nonconstant, depending on what the answer is, and some resource limitation might make the max-p(win) strategy different from the optimal discriminator. But once you start thinking about how you'd win a game with those rules, strategies for smarter search suggest themselves.
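The non-uniform-prior point can be made precise with expected information gain: under a skewed prior, the best question halves the *probability mass*, not the candidate count. The prior below is invented for illustration.

```python
import math

# Hypothetical skewed prior over who the hidden candidate is.
prior = {"Alice": 0.5, "Bob": 0.25, "Carol": 0.125, "Dave": 0.125}

def entropy(weights):
    """Shannon entropy (bits) of weights after normalizing them."""
    total = sum(weights)
    return -sum((w / total) * math.log2(w / total) for w in weights if w)

def expected_info_gain(prior, yes_set):
    """Expected entropy reduction from a yes/no question whose
    'yes' answers are exactly the candidates in yes_set."""
    p_yes = sum(p for c, p in prior.items() if c in yes_set)
    p_no = 1 - p_yes
    h_yes = entropy([p for c, p in prior.items() if c in yes_set])
    h_no = entropy([p for c, p in prior.items() if c not in yes_set])
    return entropy(prior.values()) - (p_yes * h_yes + p_no * h_no)

# "Is it Alice?" splits the mass 50/50 and earns a full bit,
# beating the question that splits the *count* of candidates evenly.
print(expected_info_gain(prior, {"Alice"}))          # → 1.0
print(expected_info_gain(prior, {"Alice", "Bob"}))   # ≈ 0.81
```

With a uniform prior the two notions coincide; once the prior is skewed, the mass-halving question dominates, which is one of the rule changes the comment is pointing at.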