This is a great post, but I think the argument leans too heavily on valence, which is a questionable requirement; the thrust of your argument goes through without it.
Concretely, imagine a philosophical Vulcan: a creature exactly like a human, with rich conscious experience but no valence. Would it be permissible to kill five Vulcans to save one human? This isn’t obvious to me at all. Intuitively, the fact that Vulcans have rich inner conscious experience means their lives have intrinsic value, even if that experience isn’t valenced.
To be sure, I think you can just modify your argument to avoid mentioning valence. Roughly,
1. WBEs (whole brain emulations) have mental states that are intrinsically valuable.
2. LLMs have states that are functionally equivalent to those of WBEs up to a relevant degree of abstraction.
3. States that are functionally equivalent up to a relevant degree of abstraction are equivalent in all other respects (functionalism).
Therefore, LLMs have mental states that are intrinsically valuable.
This is just a quick sketch (and premise 3 is probably too strong as stated), but the argument above captures the thrust of your post without mentioning emotions or valence at all.