Honestly, for the past two years I’ve just had to go back and forth banging my head on Friston’s free-energy papers, non-Friston free-energy papers, and the ordinary variational inference literature. Before that, I spent three years banging my head on the Josh Tenenbaum-y computational cog-sci literature and got used to seeing probabilistic models of cognition.
I’m now really fucking glad to be in a PhD program where I can actually use that knowledge.
Oh, and btw, everyone at MIRI was exactly as confused as Scott is when I presented a bunch of free-energy stuff to them last March.
Bingo. Friston trained as a physicist, and he wants the free-energy principle to be more like a physical law than a computer program. You can write basically any computer program that implements or supports variational inference, throw in some action states as variational parameters, and you’ve “implemented” the free-energy principle _in some way_.
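To make that concrete, here’s a toy sketch of what “variational inference plus action states as variational parameters” looks like. The model is entirely my own illustrative construction (a one-dimensional Gaussian setup, not anything from Friston’s papers): the hidden state has prior N(m0, 1), the observation is N(x, 1), and the agent’s action *is* the observation it produces. Gradient descent on one free-energy expression then does both perception (updating the variational mean `mu`) and action (updating `a`):

```python
# Toy generative model (my own illustrative assumption, not Friston's setup):
#   hidden state x ~ N(m0, 1), observation o ~ N(x, 1),
#   and the action a directly determines the observation (o = a).
# For a Gaussian q(x) = N(mu, 1), the variational free energy reduces
# (dropping constants) to:
#   F(mu, a) = 0.5 * ((a - mu)**2 + (mu - m0)**2)
# Descending F in mu is perception; descending F in a is action, which
# drags the observation toward the prior/goal mean m0.

def free_energy(mu, a, m0):
    return 0.5 * ((a - mu) ** 2 + (mu - m0) ** 2)

def descend(m0, steps=500, lr=0.1):
    """Joint gradient descent on F over the variational mean and the action."""
    mu, a = 0.0, 0.0
    for _ in range(steps):
        grad_mu = 2 * mu - a - m0   # dF/dmu
        grad_a = a - mu             # dF/da
        mu -= lr * grad_mu
        a -= lr * grad_a
    return mu, a

mu, a = descend(m0=3.0)
# Both converge to the prior mean: mu ≈ 3, a ≈ 3 — the agent acts so as
# to make its own prediction come true.
```

The point of the sketch is just how cheap the “implementation” is: any gradient-based variational inference loop becomes “active inference” the moment you let an action variable enter the free energy.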
Overall, the Principle is more of a domain-specific language than a single unified model, more like “supervised learning” than like “this 6-layer convnet I trained for neural style transfer.”
No. They’re isomorphic, via the Complete Class Theorem. Any utility/cost function that grows sub-super-exponentially (i.e., for which Pascal’s Mugging doesn’t happen) can be expressed as a distribution and used in the free-energy principle. You can get the intuition by thinking, “This goal specifies how often I want to see outcome X (P), versus its disjoint cousins Y and Z that I want to see such-or-so often (1-P).”
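For a finite outcome set, the translation is just exponentiate-and-normalize, i.e. a softmax over utilities (the function name and toy numbers below are mine):

```python
import math

def utility_to_prior(utilities):
    """Turn a bounded utility function over outcomes into a goal
    distribution via p(o) ∝ exp(U(o)) — a softmax. Differences in
    utility become ratios of desired outcome frequencies."""
    umax = max(utilities.values())  # subtract max for numerical stability
    expu = {o: math.exp(u - umax) for o, u in utilities.items()}
    z = sum(expu.values())
    return {o: e / z for o, e in expu.items()}

# "I want X about e^2 ≈ 7.4x as often as Y or Z":
goal = utility_to_prior({"X": 2.0, "Y": 0.0, "Z": 0.0})
# goal["X"] ≈ 0.79, goal["Y"] = goal["Z"] ≈ 0.11
```

Boundedness (or sub-super-exponential growth) is what makes the normalizer finite; a Pascal’s-Mugging-style utility would blow up the sum and leave you with no valid distribution.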
This is actually one of the Very Good things about free-energy models: since free energy is “Energy − Entropy”, or “Exploit + Explore”, cast in the same units (bits/nats from info theory), it gives a principled, prescriptive way to make the tradeoff, once you’ve specified how concentrated the probability mass is under the goals in the support set (and thus the multiplicative inverse of the exploit term’s global optimum).
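The decomposition is easy to check numerically. Below, F[q] = E_q[−log p̃] − H[q] (the energy/exploit term minus the entropy/explore term, both in nats); the minimizer is the normalized goal distribution itself, and both a purely greedy q and a purely uniform q score strictly worse. The toy goal weights are my own:

```python
import math

def free_energy(q, ptilde):
    """F[q] = E_q[-log p~] - H[q]  (exploit term minus explore term, in nats).
    Minimized over q, F = -log Z with Z = sum(p~); the minimizer is the
    Boltzmann/softmax distribution q*(s) = p~(s)/Z."""
    energy = -sum(qi * math.log(pi) for qi, pi in zip(q, ptilde) if qi > 0)
    entropy = -sum(qi * math.log(qi) for qi in q if qi > 0)
    return energy - entropy

ptilde = [0.7, 0.2, 0.1]          # goal weights; they sum to 1, so Z = 1
q_star = ptilde                   # the known minimizer
q_greedy = [1.0, 0.0, 0.0]        # pure exploit: zero entropy
q_uniform = [1/3, 1/3, 1/3]       # pure explore: maximum entropy

# F(q_star) = -log Z = 0; both extremes come out strictly positive.
```

Because both terms are in nats, there’s no free “temperature” knob to tune by hand: once the goal weights are fixed, the exploit/explore balance is fixed too, which is what makes the tradeoff prescriptive rather than ad hoc.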
We ought to be able to use this to test the Principle empirically, I think.
(EDIT: Dear God, why was everything bold!?)