Conscious phenomenology should only arise in systems whose internal states model both the world and their own internal dynamics as an observer within that world. Neural or artificial systems that lack such recursive architectures should not report or behave as though they experience an “inner glow.”
What part of staring at a white wall without inner dialog and then later remembering it requires inner modeling at the moment of staring?
Internal shifts in attention and expectation can alter what enters conscious awareness, even when sensory input remains constant. This occurs in binocular rivalry and various perceptual illusions [17], consistent with consciousness depending on recursive self-modeling rather than non-cyclic processing of external signals.
But why would making the processing non-cyclic result in the experience becoming unconscious, instead of, I don't know, conscious but less filtered by attention?
And as usual, do you then consider any program that reads its own code to be conscious?
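To make the question concrete, here is the kind of trivially "self-reading" program I have in mind, a hypothetical Python sketch (my own illustration, nothing from the original claim) that opens its own source file and reports on itself:

```python
# Hypothetical sketch: a program that "reads its own code" in the most
# literal sense. It opens its own source file and prints a report about
# itself. The rhetorical point is that this trivial self-inspection is
# presumably not what anyone means by conscious self-modeling.
import sys


def read_own_source() -> str:
    """Return the text of this very file."""
    with open(__file__, "r", encoding="utf-8") as f:
        return f.read()


if __name__ == "__main__":
    source = read_own_source()
    print(f"I am {sys.argv[0]} and my source is {len(source)} characters long.")
    print("First line of my own code:", source.splitlines()[0])
```

If mere access to a representation of one's own code or state were enough, this program would qualify; so the claim presumably needs to say more about what kind of self-modeling counts.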