Thanks for clarifying. To the extent that you aren’t particularly sure about how consciousness comes about, it makes sense to reason about all sorts of possibilities related to capacity for experience and intensity of suffering. In general, I’m just kinda surprised that Eliezer’s view is so unusual given that he is the Eliezer Yudkowsky of the rationalist community.
My impression is that the justification for the argument you mention is something along the lines of “the primary reason one would develop a coherent picture of their own mind is so they could convey a convincing story about themselves to others—which only became a relevant need once language developed.”
I was under the impression you were focused primarily on suffering, based on the first two sections and the similarity of the above logic to the discussion of pain-signaling earlier. However, when I think about your generic argument about consciousness, I get confused. While I can imagine why one would benefit from an internal narrative around their goals, desires, etc., I’m not even sure how I’d go about squaring pressures for that capacity with the many basic sensory qualia that people have (e.g. sense of sight, sense of touch) -- especially in the context of language.
I think things like ‘the ineffable redness of red’ are a side-effect or spandrel. On my account, evolution selected for various kinds of internal cohesion and temporal consistency, introspective accessibility and verbal reportability, moral justifiability and rhetorical compellingness, etc. in weaving together a messy brain into some sort of unified point of view (with an attendant unified personality, unified knowledge, etc.).
This exerted a lot of novel pressures and constrained the solution space a lot, but didn’t constrain it 100%, so you still end up with a lot of weird neither-fitness-improving-nor-fitness-reducing anomalies when you poke at introspection.
This is not a super satisfying response, and it has basically no detail to it, but it’s the least-surprising way I could imagine things shaking out when we have a mature understanding of the mind.