Do they have to share a structure though? I was trying to present a possible exception.
Part of my point was that evolutionary processes might be able to create two different mechanisms that are behaviorally equivalent. It could be something analogous to two machines giving the same output through entirely different inner processes.
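(To make the machine analogy concrete, here's a toy sketch in Python, purely my own illustration and not a model of emotion: two functions whose inner processes are completely different, yet no input can tell them apart from the outside.)

```python
# Toy illustration: two "machines" that are behaviorally equivalent
# (identical input/output) while their inner processes differ completely.

def sort_by_repeated_selection(xs):
    """Repeatedly pull out the smallest remaining element."""
    remaining = list(xs)
    result = []
    while remaining:
        smallest = min(remaining)
        remaining.remove(smallest)
        result.append(smallest)
    return result

def sort_by_merging(xs):
    """Recursively split the list and merge the halves back together."""
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    left, right = sort_by_merging(xs[:mid]), sort_by_merging(xs[mid:])
    merged = []
    while left and right:
        merged.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
    return merged + left + right

# From the outside there is no observation that distinguishes them:
assert sort_by_repeated_selection([3, 1, 2]) == sort_by_merging([3, 1, 2])
```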
‘Love’ is ultimately just a high-level description of behavior in living systems; people are still arguing about how accurately our folk mental concepts describe brain states.
There’s also the fact that we can seemingly abstract away as much structural precision as we want when defining ‘things’, especially with artifacts like tables (because of the metaphysical concept of ‘purpose’). If we place no restrictions on composition, the claim that any two instances of what we call the same thing share a common basic descriptive structure becomes trivial. I believe this is part of the reason numbers are so ‘useful’: they are such generic/basic/undefined constructs that it’s difficult to imagine a universe in which there’s no stuff we can abstract into things to count.
If we don’t want the idea of shared structure to be trivial, we need to restrict the idea of a ‘thing’ to something ‘natural’ rather than ‘social’. For an emotion to be ‘natural’ means that it’s an accurate and consistent description of a brain state, with somewhat defined boundaries separating it from other so-called ‘emotions’. I’m assuming a computational theory of mind here.
Then, if we have ‘human love’ and ‘alien love’, it might just be that the neural structures underlying ‘alien love’ are no more similar to those of ‘human love’ than to those of any other human emotion. In that case, what they would share in order to be called ‘love’ would be their behavioral effects: their effects on the peripherals of the brain-body system and the subsequent behavior. This might be possible through convergent evolution; if it’s not possible, that means there has to be a shared computational structure behind the same behavioral program that we identify as ‘love’.
In a theory of embodied mind, the peripherals would be part of the mental process of ‘love’ itself, so they would likely need to share a common structure by necessity.