TagLast edit: 29 May 2023 18:59 UTC by Roman Leventov

Generative Flow Networks (GFlowNets) are a new paradigm of neural-net training, developed at Mila since 2021.

GFlowNets are related to Markov chain Monte Carlo (MCMC) methods (as they sample from a distribution specified by an energy function), reinforcement learning (as they learn a policy to sample composed objects through a sequence of steps), generative models (as they learn to represent and sample from a distribution), and amortized variational methods (as they can be used to learn to approximate and sample from an otherwise intractable posterior, given a prior and a likelihood). GFlowNets are trained to generate an object x through a sequence of steps with probability proportional to some reward function R(x) (or exp(−E(x)), with E denoting the energy function), given at the end of the generative trajectory.[1]
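The "sample proportionally to reward" objective can be made concrete with the trajectory-balance loss of Malkin et al. (2022), one of the standard GFlowNet training objectives. The sketch below is illustrative, not taken from the cited papers: the function name and argument conventions are my own, and in a real trainer log Z and the log-probabilities would come from learned, differentiable networks rather than plain floats.

```python
import math

def trajectory_balance_loss(log_Z, forward_logprobs, backward_logprobs, reward):
    """Squared trajectory-balance residual for one trajectory tau ending in x:

        (log Z + sum_t log P_F(s_{t+1}|s_t) - log R(x) - sum_t log P_B(s_t|s_{t+1}))^2

    The residual is zero exactly when flows are balanced, i.e. when the
    forward policy samples terminal objects x with probability R(x) / Z.
    """
    lhs = log_Z + sum(forward_logprobs)              # forward flow along tau
    rhs = math.log(reward) + sum(backward_logprobs)  # backward flow from x
    return (lhs - rhs) ** 2
```

For example, a one-step generator with two terminal objects of rewards 1 and 3 has partition function Z = 4; the optimal forward policy picks the reward-3 object with probability 3/4, and (since each object has a single parent, so the backward log-probability is 0) the loss at that policy is zero.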

Through generative models and variational inference, GFlowNets are also related to Active Inference.

GFlowNets promise better interpretability and more robust reasoning than current auto-regressive LLMs.[2]

  1. ^

    Pan, L., Malkin, N., Zhang, D., & Bengio, Y. (2023). Better Training of GFlowNets with Local Credit and Incomplete Trajectories (arXiv:2302.01687). arXiv. https://doi.org/10.48550/arXiv.2302.01687

  2. ^

    Bengio, Y., & Hu, E. (2023, March 21). Scaling in the service of reasoning & model-based ML. Yoshua Bengio. https://yoshuabengio.org/2023/03/21/scaling-in-the-service-of-reasoning-model-based-ml/

Aligning an H-JEPA agent via training on the outputs of an LLM-based “exemplary actor”

Roman Leventov29 May 2023 11:08 UTC
12 points
10 comments · 30 min read · LW link

Yoshua Bengio argues for tool-AI and to ban “executive-AI”

habryka9 May 2023 0:13 UTC
53 points
15 comments · 7 min read · LW link

Annotated reply to Bengio’s “AI Scientists: Safe and Useful AI?”

Roman Leventov8 May 2023 21:26 UTC
18 points
2 comments · 7 min read · LW link