My strong guess is that AIs won’t by default care about other sentient minds
nit: this presupposes that the de novo mind is itself sentient, which I think you’re (rightly) trying to leave unresolved (because it is unresolved). I’d write:
My strong guess is that AIs won’t by default care about sentient minds, even if they are themselves sentient
(Unless you really are trying to connect alignment necessarily with building a sentient mind, in which case I’d suggest making that more explicit)