So, in summary, here is a stab at what meaning and relating might be, in terms that might actually be (ahem) meaningful if you were building a robot from scratch.
I realize you were joking in your second use of the word “meaning(ful)” in that sentence, but I actually found the connection suggestive. To define something in a way we could build a robot around, the definition has to be genuinely operationalized and practical: it has to connect to extremely grounded quantities, like bits of metal and electrons moving down wires. That groundedness, which our concepts need if we want to build a robot that actually does the thing, seems to me like a clue about what makes something meaningful.
When I ask “What does my job mean?”, or “what does my relationship with my father mean?” or “what is the meaning of life?”, I’m asking “how do my high level strategic goals correspond to each other, in a way that is consistent, minimizes overhead when I shift tasks, and allows me to confidently filter out irrelevant details?”
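To make that operational framing concrete, here’s a toy sketch (all names, and the tagging scheme, are my own invention for illustration, not anything from the OP): goals form a graph where each goal supports higher-level goals, and “filtering out irrelevant details” becomes a cheap check against the active goal and everything above it.

```python
# Toy sketch: "integration between goals" as a graph, with relevance-
# filtering as a cheap walk up the goal hierarchy. Hypothetical throughout.
from dataclasses import dataclass, field

@dataclass
class Goal:
    name: str
    relevant_tags: set = field(default_factory=set)  # details this goal cares about
    supports: list = field(default_factory=list)     # higher-level goals this one serves

def is_relevant(detail_tags: set, active_goal: Goal) -> bool:
    """A detail matters if it touches the active goal or anything it supports."""
    if detail_tags & active_goal.relevant_tags:
        return True
    return any(is_relevant(detail_tags, parent) for parent in active_goal.supports)

# Example: a career goal nested under a broader security goal.
security = Goal("financial security", {"money", "savings"})
career = Goal("do well at job", {"work", "skills"}, supports=[security])

print(is_relevant({"work"}, career))             # True: directly relevant
print(is_relevant({"savings"}, career))          # True: relevant via the parent goal
print(is_relevant({"celebrity gossip"}, career)) # False: filtered out
```

The point of the sketch is just that “minimizes overhead when I shift tasks” cashes out as: switching the active goal changes which details pass the filter, without rebuilding the whole hierarchy.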
But what about someone who has a whole life philosophy that connects together all the parts of their life, but doesn’t breathe any vibrancy into any of it? I’m picturing someone with a job, a house, a hobby or two, and a life philosophy that bottoms out with the view that it’s all a big game of getting ahead (just to take one example). It seems to me that person could have quite a high level of integration between their goals, but at the same time could experience quite low meaning in their life. I’d expect this absence of meaning to manifest in specific ways, such as a kind of tense melancholy that pervades life.
Hmm, yeah I think you have convinced me the current frame is insufficient.
Some further musings… (epistemic status: who knows?)
Seems like there are at least a few things going on:

1. Alignment-of-purposes, and a sense of “I’m doing the thing I’m supposed to be doing.”
2. “The thing I’m doing here matters, somehow.”
3. “I feel vibrant / excited about the things I’m doing.”
Number 2 I am perhaps most confused about. Will come back to that in a sec.
Number 3 seems to decompose into “why would you build a robot that had vibrance/excitement, or emotions in general?” I don’t think I can give a technical answer here that I clearly understand, but I have a vague fuzzy model of “emotions are what feedback loops feel like from the inside, when the feedback loops are constructed some-particular-way.” I don’t know what-particular-way the feedback loops need to be constructed so as to generate the internal feeling of vibrance/excitement, but… I feel sort of okay about that level of mysteriousness. It feels like a blank spot in my map, but not a confusing blank spot in my map.
I suspect if we built a robot on purpose, we’d ideally want to do it without the particular kind of feedback-loops/emotions that humans have. But, if I’m dumb ol’ evolution, building robots however I can without the ability to think more than one generation ahead… I can imagine building some things with emotions, one of which is some kind of vibrance, excitement, enthusiasm, etc. And then, when that organism ends up having to do high-level strategic planning in confusing domains, the architecture for those emotions-and-corresponding-qualia ends up being one of the building blocks that the meaning-making process gets constructed out of.
...
returning to #2:
So one thing that comes up in the OP is that humans don’t just have to fill in an ontology beneath a clear-cut goal. They also have multiple goals, and have to navigate between them. As they fill in their ontology that connects their various goals, they have to guess at how to construct the high level goals that subgoals nest under.
StarCraftBot has to check “does this matter, or not?” for various actions, like “plan an attack” or “establish a new base.” But it has a clear ultimate goal that unambiguously matters in a particular way, about which it probably wouldn’t need complex emotions.
But for us, “what is the higher-level goal? Do we have a thing that matters, or not?” is something we’re more fundamentally confused about, so having a barometer for “have we figured out whether we’re doing things that matter?” is actually more useful.
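The contrast can be sketched in toy form (again, all names and the particular “barometer” formula are my own hypothetical illustration): a bot with one fixed ultimate goal only needs to score an action’s contribution to that goal, while an agent uncertain about its top-level goals also wants a reading on “have I figured out what matters yet?”

```python
# Toy contrast, hypothetical throughout. A StarCraft-style bot has one fixed
# ultimate goal, so "does this matter?" is a simple value check. A human-like
# agent is uncertain which candidate top-level goal is the real one, so it
# also needs a barometer for how settled that question is.

def bot_matters(action_value: float, threshold: float = 0.0) -> bool:
    # Fixed goal: an action matters iff it's expected to help win.
    return action_value > threshold

def human_barometer(goal_weights: dict) -> float:
    # Uncertain goals: read "confidence that I know what matters" as how
    # concentrated the credence over candidate top-level goals is.
    total = sum(goal_weights.values())
    if total == 0:
        return 0.0
    return max(goal_weights.values()) / total

print(bot_matters(0.3))   # True: helps win, so it matters
print(human_barometer({"career": 0.2, "family": 0.2, "art": 0.2}))  # ~0.33: still confused
print(human_barometer({"family": 0.9, "career": 0.1}))              # ~0.9: mostly settled
```

The design point is just that the bot’s check is a pure function of a known goal, whereas the human-like barometer is a second-order signal about the goal structure itself, which is where something emotion-like might plausibly earn its keep.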
Maybe. idk.