[Question] Does the existence of shared human values imply alignment is “easy”?

Observation: Humans seem to genuinely care about each other. How were culture and evolution able to achieve this?

While not with 100% confidence (maybe 50-95%), I would trust most humans not to push a big red button that kills all humans, even if pushing it gave that person some large personal benefit in this weird hypothetical. Does this mean that most humans are aligned? Is there a reason to believe intellectually amplified humans would not remain aligned in this way? How did evolution manage to encode this value of altruism, and does the fact that it was possible imply we should also be able to do it with AI?

This seems like the sort of confused question that has been asked before. If so, I'd appreciate a link.