it’s important to ensure that the future will contain many people who have better lives than us
Are you sweeping the complexity of value under the terms “better” and “veridical”? Does following your axiology prevent humanity from evolving into a race of happy-go-lucky clones?
Yes. It’s hard enough to come up with a decent way of aggregating individual welfares without making a comprehensive theory of value.