Yeah, non-human animals remain a tricky subject. For example, I’m pretty sure thermostats are minimally conscious in a technical sense, yet they probably don’t suffer in any meaningful way: even if we allow “pain” to include things like negative-valence feedback, they have no way to experience pain as pain (and what would “negative valence” even mean for a thermostat?). Yet somewhere along the way we get things conscious enough that we can suspect them of suffering the way we do, or at least of suffering the way we do to a lesser degree.
I like your thought that humans, or more conscious processes generally (in the IIT sense that consciousness can be quantified), may be capable of more suffering, since it lines up with my expectation that things that experience themselves more have more opportunity to experience suffering. This also has interesting implications for the potential suffering of AIs and other future things that may be more conscious than anything that presently exists.