I would correct “Therefore, in laying down motivational circuitry in our ancient ancestors, evolution did not have to start from scratch, and already had a reasonably complex ‘API’ for interoceptive variables.”
from the summary to something like this
"Therefore, in laying down motivational circuitry in our ancient ancestors, evolution did not have to start by locating 'goals' and relevant world-features in the learned world models. Instead, it re-used the existing goal-specifying circuits and implicit world-models already present in older organisms. Most of the goal specification is done via 'binding' the older and newer world-models on some important variables. From within the newer circuitry, an important part of the 'API' between the models is interoception."
(Another way to think about it: imagine a blurrier line between a "sensory signal" and a "reward signal".)
Jan—well said, and I strongly agree with your perspective here.
Any theory of human values should also be consistent with the deep evolutionary history of the adaptive origins and functions of values in general—from the earliest Cambrian animals with complex nervous systems through vertebrates, social primates, and prehistoric hominids.
As William James pointed out in 1890 (paraphrasing here), human intelligence depends on humans having more evolved instincts, preferences, and values than other animals, not fewer.