Many people (specifically, people over at RationalWiki, and probably elsewhere as well) see the community as being insular, or as being a Yudkowsky Personality Cult, or think that some of the weirder-sounding ideas widely espoused here (cryonics, FAI, etc.) “might benefit from a better grounding in reality”.
Still others reflexively write LW off based on the use of fanfiction (a word of dread and derision in many circles) to recruit members.
Even the jargon derived from the Sequences may put some people off. Despite the staunch avoidance of hot-button politics, the Sequences still import a few lesser controversies. For example, there are still people who outright reject Bayesian probability, and many more who see Bayes’ theorem as a tool valid only in a very narrow domain. Brazenly disregarding their opinions can come across as haughty, even if the maths are on your side.
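For what it’s worth, the theorem itself is uncontroversial as arithmetic; the dispute is about how broadly it applies. A toy illustration, with numbers invented purely for the example:

```python
# Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)
# Toy diagnostic-test example; all numbers are made up for illustration.
prior = 0.01           # P(disease): base rate in the population
sensitivity = 0.90     # P(positive | disease)
false_positive = 0.05  # P(positive | no disease)

# Total probability of a positive result (law of total probability)
p_positive = sensitivity * prior + false_positive * (1 - prior)

# Posterior probability of disease given a positive test
posterior = sensitivity * prior / p_positive
print(round(posterior, 3))  # ~0.154: still unlikely, despite the positive test
```

The counterintuitive smallness of the posterior is exactly the kind of result Bayesians point to; the disagreement is over whether such priors are meaningful outside narrow statistical settings.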
The example you give to prove plausibility is also a counterexample to the argument you make immediately afterwards. We know that less-intelligent or even non-intelligent things can produce greater intelligence because humans evolved, and evolution is not intelligent.
It’s more a matter of whether we have enough time to dredge something reasonable out of the problem space. If we were smarter, we could search it faster.