Some people reacted with scepticism to this:
assigned double-digit probabilities to bacteria having qualia and said they wouldn’t be surprised if a balloon flying through a gradient of air experienced pain
Here’s something from a comment on the EA Forum:

I think it’s likely that even simple “RL algorithms” might have a very limited, very shallow, non-self-aware kind of experience: an image classifier is doing visual information processing, so it probably also produces isolated “experiences of vision”

Not sure if they expect a small CNN to possess qualia (and do they then think that when physics performs essentially equivalent matrix multiplications to compute rocks, there are lots of qualia of random visions in rocks?), but maybe it’s easy to underestimate how confused many people are about all that stuff
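For concreteness, the “matrix multiplications” at issue are the entire substance of such a classifier’s forward pass: multiply-adds followed by a normalisation. A minimal sketch in plain Python (the layer sizes and all names here are invented for illustration, not taken from any real model):

```python
import math
import random

random.seed(0)

# Hypothetical toy "image classifier": one dense layer over a flattened
# 8x8 "image". The point is that the whole computation is just a
# matrix-vector multiplication followed by a softmax.
image = [random.random() for _ in range(64)]                     # flattened pixels
weights = [[random.random() for _ in range(64)] for _ in range(10)]  # 10 classes
bias = [random.random() for _ in range(10)]

# One matrix-vector multiplication: logits = W @ image + b
logits = [sum(w * x for w, x in zip(row, image)) + b
          for row, b in zip(weights, bias)]

# Softmax turns logits into class probabilities
m = max(logits)
exps = [math.exp(l - m) for l in logits]
probs = [e / sum(exps) for e in exps]

# The "recognised" class is just the largest probability
predicted = max(range(10), key=lambda i: probs[i])
```

Whether a rock’s physics performing operationally similar multiply-adds would thereby host “experiences of vision” is exactly the question the passage is gesturing at.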