This is my inclination, but a physicalist either predicts that the phenomenology would in fact change, or perhaps asserts that you’re deluded about your phenomenal experience when you think the experience stays the same despite substrate shifts. My understanding of cube_flipper’s position is that they anticipate that changes in the substrate will change the qualia.
From a physicalist’s perspective, you’re essentially making predictions based on your theory of phenomenal consciousness, and then arguing that we should already update on those predictions ahead of time, since they’re so firm. I’m personally sympathetic to this line of argument, but it obviously depends on some assumptions which need to be articulated, and which the physicalist would probably not be happy to make.
I guess I am a bit confused about the process of encoding phenomenological data into bits. If a physicalist is doing it, they might include (from a computationalist perspective) unnecessary detail about the movement of subatomic particles. If a computationalist is doing it, they might (from a physicalist perspective) exclude important detail about EM fields that affect qualia. Or is there a common ground on which both perspectives can agree?
Trying to answer my own question: the obvious way is to have everything encoded, down to every quantum fluctuation. In that case, the computationalist hypothesis has to explain all of the thermal noise in addition to consciousness, which seems unfair to me, since it is a theory of consciousness, not of physics.