As in the OP, I strongly agree with you that it’s a bad idea to go around saying “my cause is more important than your cause”. If anyone reading this right now is thinking “yeah, maybe it pisses people off, but it’s true, pffffft”, then I would note that rationalist-sphere people bought into AI x-risk are nevertheless generally quite capable of caring about things that are not important compared to AI x-risk from a Cause Prioritization perspective: YIMBY issues, how much the FDA sucks, the replication crisis, price gouging, etc. If they got judged harshly for caring about FDA insanity when the future of the galaxy is at stake from AI, it would be pretty annoying. So, by the same token, they shouldn’t judge others harshly for caring about whatever causes they happen to care about. (But I’m strongly in favor of people (like me) who think AI x-risk is real and high trying to convince other people of that.)