I think the central argument is that subjective experience is ostensibly more profound the more information it integrates, both at a single moment and over time. I would think of it, or any experience, as the depth of cognition and attention over which the stimulus takes control of coherence (i.e., the number of feedback loops commandeered or reoriented by that single bad experience, and the neural re-shuffling this requires), extrapolated over how long that 'painful' reprocessing continues to manifest in lived experience.

If you have the brain of a goldfish, the pain of a pinch oscillates through far fewer attention feedback loops than it does in a human, where a much larger set of cognitive faculties gets 'jarred' and has its attention stolen to get away from that pinch. Secondly, the degree of coherence our subjectivity inhabits is likely loosely correlated with, and partly a consequence of, having stronger long-term retention faculties. If felt pain were solely a 'miss' within an agent's objective function, then even the smallest ML algorithms would already 'hurt' as they are. In other words, subjectivity is emergent from the depth and scale of these feedback loops (which nature requires), but it is not isomorphic to a bare value-function miss.
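To make the distinction concrete, here is a toy sketch, not anything from the argument itself: the names (`AversiveEvent`, `experiential_depth`) and every number in it are hypothetical, chosen only to illustrate how a bare objective-function miss differs from a measure that scales with the breadth of feedback loops reoriented and the duration of reprocessing.

```python
# Toy contrast between a bare "objective-function miss" and an
# integration-weighted measure of how "profound" an aversive event is.
# All names and quantities are made up for illustration.

from dataclasses import dataclass


def objective_function_miss(predicted: float, target: float) -> float:
    """A plain scalar loss: the 'miss' that even a tiny ML model registers."""
    return (predicted - target) ** 2


@dataclass
class AversiveEvent:
    loops_reoriented: int         # feedback loops the event commandeers
    reprocessing_duration: float  # how long the re-shuffling keeps surfacing (arbitrary units)
    miss: float                   # the underlying objective-function miss


def experiential_depth(event: AversiveEvent) -> float:
    """Scale the same miss by breadth (loops) and persistence (duration).

    The point of the toy: two agents can register an identical miss
    yet differ enormously on this measure.
    """
    return event.miss * event.loops_reoriented * event.reprocessing_duration


goldfish_pinch = AversiveEvent(loops_reoriented=3, reprocessing_duration=1.0, miss=1.0)
human_pinch = AversiveEvent(loops_reoriented=300, reprocessing_duration=50.0, miss=1.0)

print(experiential_depth(goldfish_pinch))  # 3.0
print(experiential_depth(human_pinch))     # 15000.0
```

The identical `miss` value in both cases is the whole point: on the view above, what differs between the goldfish and the human is not the error signal itself but how widely and how long it reverberates.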