My guess is this is obvious, but it seems extremely unlikely to me that bee experience is remotely as important to care about as cow experience.
I agree with this, but would strike the ‘extremely’. I don’t actually have a gears-level model of how some algorithms produce qualia. ‘Something something, self-modelling systems, strange loops’ is not a gears-level model. I mostly don’t think a million-neuron bee brain would be doing qualia, but I wouldn’t say I’m extremely confident.
Consequently, I don’t think people who say bees are likely to be conscious are making such an obvious mistake that we have to go looking for a signalling explanation for why they produce those words.