I don’t think you got the question.
You see, if we define “shouldness” as optimization of human values, then it does indeed logically follow that people should act altruistically:
People should do what they should
Should = Optimization of human values
People should do what optimizes human values
Altruism ∈ Human Values
People should act altruistically
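The syllogism above can be sketched formally. This is a minimal Lean rendering under my assumed reading, where the predicate names (`Should`, `OptimizesHumanValues`) and the treatment of “Altruism ∈ Human Values” as “altruism optimizes human values” are illustrative, not from the original:

```lean
-- A sketch of the argument as a theorem with the premises as hypotheses.
-- `Action`, `Should`, `OptimizesHumanValues`, and `altruism` are
-- hypothetical names chosen for illustration.
theorem should_altruism
    (Action : Type)
    (Should OptimizesHumanValues : Action → Prop)
    (altruism : Action)
    -- Premise: "should" = optimization of human values
    (should_def : ∀ a, OptimizesHumanValues a → Should a)
    -- Premise: altruism optimizes human values
    (h : OptimizesHumanValues altruism) :
    Should altruism :=
  should_def altruism h
```

The conclusion follows in one step; all the philosophical weight sits in the two premises, not in the inference.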
Is that what you were looking for?
By “should” I mean any currently accepted model from which you can derive altruism; the only one I know of so far is evolution, or something that stems from it.