If such official, at-the-edge outputs are all that matters for computationalism, then dumb-but-fast Look Up Tables could be conscious, which is a problem.
That’s not what I claimed; in fact, I was trying to be careful to rule that out. I said the system can be arbitrarily divided, and that replacing any part with a different part or black box that gives the same outputs the original would have given does not affect the rest of the system. Some patterns of replacement remove the conscious parts; some do not.
This is important because I am trying to establish “red” and other phenomena as relational properties of a system containing both me and a red object. This is something that I think distinguishes my answer from others’.
I’m distinguishing further between two replacements. Removing my eyes and the red object, and replacing them with a black box sending inputs into my optic nerves, preserves consciousness. Replacing my brain with a black-box lookup table while keeping my eyes and the object intact removes the conscious subsystem of the larger system. Note that some form of the larger system is a requirement for seeing red.
My answer highlights how only some parts of the conscious system are necessary for the output we call consciousness, and guards against two confusions: thinking that all elements of the conscious computing system are essential to consciousness, and thinking that all of them may be replaced.
The algorithm is sensitive to certain replacement of its parts with functions, but not others.
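To make the black-box point concrete, here is a minimal toy sketch (the part names and encodings are my own illustrative inventions, not anything from the discussion above): a two-part "see red" pipeline in which one part is swapped for a lookup table with identical input/output behavior, so the system's edge outputs are unchanged for every input in the domain.

```python
# Toy illustration only: a pipeline of parts, where one part is swapped
# for a lookup table that reproduces its input/output behavior exactly.

def eye(scene):
    # Hypothetical "eye": maps a scene to a signal on the optic nerve.
    return {"red apple": "long-wavelength", "blue sky": "short-wavelength"}[scene]

def brain(signal):
    # Hypothetical "brain": maps the nerve signal to a report.
    return "I see red" if signal == "long-wavelength" else "I see not-red"

def system(scene, eye_part=eye, brain_part=brain):
    # The whole system: eye feeding brain, yielding the at-the-edge output.
    return brain_part(eye_part(scene))

# Black-box replacement of the brain: a lookup table built from the
# original part's input/output pairs.
brain_table = {s: brain(s) for s in ("long-wavelength", "short-wavelength")}

def brain_lut(signal):
    return brain_table[signal]

# The edge outputs are identical under the replacement for every input...
assert all(system(scene) == system(scene, brain_part=brain_lut)
           for scene in ("red apple", "blue sky"))
# ...yet, on the view argued above, this particular replacement removes the
# conscious subsystem, whereas replacing the eye with an equivalent black
# box feeding the optic nerve would not.
```

The sketch shows only the functional-equivalence half of the claim; which replacements preserve consciousness is the philosophical question the surrounding text addresses, not something the code can decide.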