On the wider set of cases you hint at, my current view would be that there are only two cases that I’m ethically comfortable with:
an evolved sapient being with the usual self-interested behavior, to whom our ethical system grants moral patient status (by default, roughly equal moral patient status, subject to some of the issues discussed in Part 5)
an aligned constructed agent whose motivations are entirely creator-interested and who actively doesn't want moral patient status (see Part 1 of this sequence for a detailed justification of this)
Everything else (domesticated animals, non-aligned AIs kept in line by threat of force, slavery, uploads, and so forth) I'm concerned about the ethics of, to varying degrees obviously, but I haven't thought several of those cases through in detail. Not that we currently have much choice about domesticated animals, but I feel that, at a minimum, by creating them we take on a responsibility for them: it's now our job to shear all the sheep, for example.