Another reason I wouldn’t put any stock in the idea that animals aren’t conscious is that the complexity cost of a model in which we are conscious and they (other animals with complex brains) are not is many bits of information. A 20-bit cost gives a prior probability factor of 2^-20 ≈ 10^-6. I’d say that outweighs the larger number of animals, even if you were to include the animals in the reference class.
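To make the comparison concrete, here is a quick sketch of the arithmetic. The population figures are rough, illustrative assumptions, not claims from the comment:

```python
# A k-bit complexity penalty corresponds to a prior factor of 2^-k.
complexity_bits = 20
prior_factor = 2 ** -complexity_bits
print(f"2^-{complexity_bits} = {prior_factor:.2e}")  # about 1e-6

# For comparison, the anthropic update from including animals in the
# reference class scales roughly as (humans) / (reference-class size).
# Hypothetical numbers: ~1e10 humans vs. ~1e12 animals with complex brains.
humans = 1e10
complex_brains = 1e12
anthropic_factor = humans / complex_brains
print(f"anthropic factor ~ {anthropic_factor:.0e}")  # about 1e-2
```

Even with a reference class a hundred times larger than humanity, the anthropic penalty (~10^-2) is dwarfed by the ~10^-6 complexity penalty, which is the point being made.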
The complexity cost of a model in which any brain is conscious is enormous. Keep in mind that a model with consciousness has to ‘output’ qualia, concepts, thoughts… which (as far as we can tell) correspond to complex brain patterns which are physically unique to each single brain.
That is, unless the physical implementation of subjective experience is much simpler than we think it is.