There is a subject called ethics. It tells you how to achieve certain goals, such as maximising happiness....
Well there’s the problem: ethics does not automatically start out with a happiness-utilitarian goal. Lots of extant ethical systems use other terminal goals. For instance...
“Such as”
Sufficient rationality will tell you how to maximize any goal, once you can clearly define the goal.
Rationality is quite helpful for clarifying goals too.