David tries to punt things to LLMs at least once a day on average when we’re working. So far, they continue to work best when they can act as Google Search Plus Plus—i.e. when there’s some already-known fact relevant to what we’re doing, and they surface that fact to us. Occasionally they can complete a conceptually-simple-but-technically-dense proof by combining a few such facts, and very often they can write some useful code by combining a few already-known pieces.
For anything novel, they remain almost always useless in our experience: they string together words which sound relevant, but the semantics don’t make any sense.