Hypothesis regarding your confusion about agency:
Describing humans via a “utility function” or in terms of “goals” is wrong.
Humans are mostly a bundle of habits (like CFAR’s TAPs, i.e. trigger-action plans) that correlate somewhat with working toward goals, but reading goals into them is more an imperfect rationalization than a natural description of what is going on.
And yes, we do have some part that thinks in terms of goals, but it has far less effect on anything (such as actions) than we would naturally assume.
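A toy way to see the contrast being drawn here (my own illustration, not from the comment; the trigger strings and functions are made up): a utility-maximizing agent picks actions by optimizing a single objective, while a TAP-style agent is just a lookup from triggers to actions, with no global objective anywhere. Any apparent “goal-directedness” is a pattern an observer reads into which triggers happen to be installed.

```python
def utility_agent(state, actions, utility):
    """Pick the action that maximizes a utility function over outcomes."""
    return max(actions, key=lambda a: utility(state, a))

# A habit-based agent: a bare trigger -> action table.
# Some installed habits help stated goals, some cut against them.
taps = {
    "alarm rings": "get up",
    "see running shoes": "go for a run",
    "open laptop": "check email",  # fires regardless of today's "goal"
}

def habit_agent(trigger, default="do nothing"):
    """No optimization, no objective: just fire whatever habit matches."""
    return taps.get(trigger, default)
```

On this picture, modeling the second agent with a utility function is a lossy after-the-fact summary of its trigger table, which is roughly the rationalization the comment is pointing at.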
Credit to a friend
[I have no idea what I’m talking about; feel free to ignore this if it doesn’t resonate, of course. It just seemed worth a comment.]