I’m skeptical because, in addition to logical reasoning, intuitive reasoning seems pretty important. And I’m not sure there’s a simpler representation of intuitive reasoning than a bunch of weights from a bunch of concepts to another concept.
Check out this great paper: “From Word Models to World Models: Translating from Natural Language to the Probabilistic Language of Thought” https://arxiv.org/abs/2306.12672 It proposes “probabilistic programming” as a formal “Probabilistic Language of Thought” (PLoT) with precise formal Bayesian reasoning. They show in 4 domains how a large language model can convert an informal statement or chain of reasoning into a precise probabilistic program, do precise Bayesian reasoning on that, and then convert the results back into informal natural language.
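To give a flavor of what “informal statement → probabilistic program → Bayesian inference” means in practice, here is a minimal sketch in plain Python (the paper itself uses a probabilistic programming language, and the translation step would be done by an LLM; the coin example and all names here are hypothetical, not from the paper):

```python
import random

# Informal statement (hypothetical): "The coin landed heads 8 times
# out of 10 -- is it probably biased?"
#
# Hand-translated into a tiny probabilistic program, then answered
# with Bayesian inference via rejection sampling.

def posterior_prob_biased(heads, flips, num_samples=100_000):
    """Prior: 50/50 over a fair coin (p=0.5) vs a biased coin (p=0.8).
    Condition on observing `heads` heads in `flips` tosses, and return
    the posterior probability that the coin is biased."""
    accepted = []
    for _ in range(num_samples):
        biased = random.random() < 0.5           # sample a hypothesis from the prior
        p = 0.8 if biased else 0.5               # heads probability under that hypothesis
        sim = sum(random.random() < p for _ in range(flips))
        if sim == heads:                         # keep only samples matching the data
            accepted.append(biased)
    return sum(accepted) / len(accepted)

print(posterior_prob_biased(8, 10))  # posterior probability the coin is biased
```

A real PLoT system would generate the model automatically from the natural-language claim and then render the posterior back into prose (e.g. “the coin is probably biased”), but the inference step is the same shape as this sketch.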