That is a very different moral position than the one I hold. I’m curious what your moral intuitions say about the qualia of reinforcement learning systems. Have you considered that many machine learning systems seem to have structures that could compute qualia much like a nervous system does, and that such structures are indeed more complex than the nervous systems of many living creatures, such as jellyfish?
I don’t know what to think about all that. I don’t know how to determine where the line is between having qualia and not. I just feel certain that any organism with a brain sufficiently similar to those of humans—certainly all mammals, birds, reptiles, fish, cephalopods, and arthropods—has some sort of internal experience. I’m less sure about jellyfish and the like. I suppose the intuition comes from the fact that the creatures I mentioned seem to actively orient themselves in the world, but it’s hard to say.
I don’t feel comfortable speculating about which AIs have qualia, or whether any do at all—I am not convinced of functionalism and suspect that consciousness has something to do with the physical substrate, primarily because I can’t imagine how consciousness can be subjectively continuous (one of its most fundamental traits in my experience!) in the absence of a continuously inhabited brain (rather than being a program that can be loaded in and out of anything, and copied endlessly many times, with no fixed temporal relation between subjective moments).