Some elements of moral thought seem to reflect an underlying reality akin to mathematical truth. “Fairness,” for instance, naturally relates to equal divisions, reciprocity, and symmetries akin to Rawls’ veil of ignorance. Eliezer discusses this here: if Y thinks “fair” is splitting the pie evenly and Z thinks “fair” is “Z gets the whole pie”, Y is just right and Z is just wrong.
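The symmetry intuition behind the pie example can be made precise with a toy derivation (my own sketch, not something from the original comment): treat the two claimants as interchangeable and ask which division survives swapping them.

```latex
% Two symmetric claimants divide a pie of size 1.
% A division assigns shares $(x,\, 1-x)$ with $0 \le x \le 1$.
% Anonymity (symmetry) requires the outcome to be invariant
% under swapping the claimants:
\[
(x,\, 1-x) = (1-x,\, x)
\quad\Longrightarrow\quad
x = 1 - x
\quad\Longrightarrow\quad
x = \tfrac{1}{2}.
\]
% The equal split is the unique division consistent with the
% symmetry; "Z gets the whole pie" (x = 0 or x = 1) fails the swap test.
```

On this reading, Y’s answer is the only fixed point of the relabeling symmetry, which is one way to cash out “Y is just right.”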
Even though human moral judgment is evolved, what it is evolved to do includes mapping out symmetries like “it’s wrong for A to murder B, or for B to murder A” → “it’s wrong for anyone to murder anyone” → “murder is generically wrong”.
Yes, it is useful for evolved mental machinery that enables cooperation and conflict resolution to have features like those you describe. I don’t agree that this points toward there being an underlying reality.