Hmm, interesting observation. I guess that counts as having the chemical state of our body as an input? I think defining it this way includes other similar cases such as feeling hunger and the need to sleep.
I’m not sure how useful these would be for text generation. It would probably allow for something like empathy, which seems good, assuming we could instill it with access to something close to the emotions humans have.
I think access to internal state would give much larger performance gains for embodied systems, though: robots that are aware they need recharging are likely to be much more effective.
I’m not sure about every emotion, but I think frustration might help LLM agents break out of loops, and skepticism/uncertainty might help reasoners recognize when they need to think about something more. Not sure about the rest, but yeah, “hunger” seems important in some contexts.
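To make the “frustration” idea a bit more concrete, here’s a rough sketch of the kind of thing I mean: nothing clever, just counting repeated actions and injecting a nudge into the prompt once the agent looks stuck. All the names (`FrustrationMonitor`, the nudge text, the fake agent loop) are made up for illustration, not from any real agent framework.

```python
# Hypothetical sketch of a crude "frustration" signal for an LLM agent loop.
# Names and structure are placeholders, not a real library API.
from collections import Counter


class FrustrationMonitor:
    """Tracks repeated actions and signals when the agent seems stuck."""

    def __init__(self, threshold: int = 3):
        self.threshold = threshold
        self.action_counts = Counter()

    def record(self, action: str) -> bool:
        """Record an action; return True once it has repeated enough to count as 'stuck'."""
        self.action_counts[action] += 1
        return self.action_counts[action] >= self.threshold

    def nudge(self) -> str:
        """Text that would be injected into the next prompt when frustration triggers."""
        return ("You have repeated the same action several times without progress. "
                "Step back and try a different approach.")


# Usage inside a hypothetical agent loop:
monitor = FrustrationMonitor(threshold=3)
for step in range(10):
    action = "search('same query')"  # imagine this comes from the model each step
    if monitor.record(action):
        extra_context = monitor.nudge()  # appended to the next prompt in a real loop
        print(f"step {step}: frustrated -> {extra_context}")
        break
```

Obviously a real version would need a fuzzier notion of “same action” (and of “progress”), but even something this blunt seems like it could stop the worst repeated-tool-call loops.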