What’s the atom of agency?
An agent takes actions which imply both a kind of prediction and a kind of desire. Is there a kind of atomic thing which implements both of these and has a natural up- and down-weighting mechanism?
For atomic predictions, we can think about the computable traders from Garrabrant Induction. These are like little atoms of predictive power which we can stitch together into one big predictor, and which naturally come with rules for up- and down-weighting them over time.
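As a loose illustration (a toy multiplicative-weights ensemble, not the actual logical-induction market mechanism), here is the kind of stitching I mean: each "trader" outputs a probability, the aggregate prediction is their weighted average, and traders get up- or down-weighted by how well they predicted. All names below are made up for the sketch.

```python
import math

def aggregate(traders, weights):
    """Weighted average of each trader's probability estimate."""
    total = sum(weights)
    return sum(w * t() for t, w in zip(traders, weights)) / total

def reweight(traders, weights, outcome, eta=0.5):
    """Up-weight traders whose prediction scored well on the observed
    outcome (log score), down-weight the rest -- a simple multiplicative
    update standing in for the richer market dynamics."""
    new_weights = []
    for t, w in zip(traders, weights):
        p = t()
        score = math.log(p if outcome else 1 - p)  # log score, always <= 0
        new_weights.append(w * math.exp(eta * score))
    return new_weights

# Two hypothetical traders: one confident, one uniform.
traders = [lambda: 0.9, lambda: 0.5]
weights = [1.0, 1.0]
print(aggregate(traders, weights))              # 0.7
weights = reweight(traders, weights, outcome=True)
print(aggregate(traders, weights))              # shifts toward the confident trader
```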
A thermostat-ish thing is like an atomic model of prediction and desire. It “predicts” that the temperature is likely to go up if it puts the radiator on, and down otherwise, and it also wants to keep the temperature around a certain setpoint. But we can’t just stitch together a bunch of thermostats into an agent the same way we can stitch a bunch of traders into a market.
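For contrast, here is a minimal thermostat sketch (illustrative only, not a real controller): its “prediction” is a crude model of how the radiator affects temperature, and its “desire” is a setpoint. Notice that nothing in it suggests a natural rule for weighting many of them against each other.

```python
class Thermostat:
    """A toy atom of prediction-plus-desire."""

    def __init__(self, setpoint):
        self.setpoint = setpoint  # the "desire": keep temperature near this value

    def predict(self, temperature, heater_on):
        # the "prediction": temperature drifts up if the radiator is on, down otherwise
        return temperature + 1.0 if heater_on else temperature - 1.0

    def act(self, temperature):
        # pick the action whose predicted outcome lands closer to the setpoint
        error_on = abs(self.predict(temperature, True) - self.setpoint)
        error_off = abs(self.predict(temperature, False) - self.setpoint)
        return error_on < error_off

t = Thermostat(setpoint=20.0)
print(t.act(18.0))  # True: turn the radiator on
print(t.act(22.0))  # False: leave it off
```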