Does not sound plausible to me. If all worries about AI somehow magically disappeared overnight (God descends from Heaven and provides a 100% credible mathematical proof that any superhuman AI will necessarily be good), Yudkowsky would still be the guy who wrote the Sequences, founded the rationalist community, created a website where the quality of discourse is visibly higher than on the rest of the internet, etc. With the threat of AI out of the way, the rationalist community would probably focus again on developing the art of human rationality, increasing the sanity waterline, etc.
Also, your argument could be used to dismiss anything. Doctors talking about cancer? They just worry that if people are no longer afraid of diseases, no one will treat the doctors as high-status anymore. Etc.