Can you elaborate on this? I have the feeling that I agree now, but I'm not certain what I'm agreeing with.
One example is that the top tiers of the community are in fact composed largely of people who directly care about doing good things for the world, and this (surprise!) goes hand in hand with being extremely good at telling who's faking it. So you won't be socially respected above a certain level until you optimize hard for altruistic goals.
Another example is that whatever your goals are, in the long run you'll do better if you first become smart, rich, and knowledgeable about AI, sign up for cryonics, prevent the world from ending, etc.