If people really wanted to optimize for social status in the rationality community, there is one easiest, canonical way to do this: get good at rationality.
I think this is false: even if your final goal is to optimize for social status in the community, real rationality would still force you to locally give it up because of convergent instrumental goals. There is in fact a significant first-order difference.
Can you elaborate on this? I have the feeling that I agree, but I'm not certain what I'm agreeing with.
One example is that the top tiers of the community are in fact composed largely of people who directly care about doing good things for the world, and this (surprise!) comes together with being extremely good at telling who's faking it. So you won't in fact be socially respected above a certain level until you optimize hard for altruistic goals.
Another example is that whatever your goals are, in the long run you'll do better if you first become smart, rich, and knowledgeable about AI, sign up for cryonics, prevent the world from ending, etc.