Eliminating humans seems directly contradictory to the idea of being a positive force for all sentient beings. If humans are killed and don’t want to be killed, that’s bad for them, so it isn’t positive for all. You might argue it’s positive on net (I’d probably disagree!), but that’s different from being positive for all.
I think your comment misunderstands the idea of moral alignment, but I’d be curious for you to give a more detailed comment explaining your position.
It’s not contradictory in a total-utilitarian sense; it’s contradictory to Pareto utilitarianism.
Maybe you think it’s obvious that some kind of Pareto principle or Do No Harm principle was intended, but it wasn’t obvious to me.
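To spell out the distinction I’m pointing at (the notation here is mine, not anything from the original post): write $u_i(A)$ for how well off sentient being $i$ is under outcome $A$. “Positive on net” is the total-utilitarian condition $\sum_i u_i(A) > \sum_i u_i(S)$, where $S$ is the status quo, while “positive for all” is the Pareto condition $u_i(A) \ge u_i(S)$ for every $i$. An outcome that kills humans who don’t want to be killed can satisfy the first condition while clearly failing the second for those humans.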