Criticism of the rationalist movement and LessWrong has existed on various grounds for most of the site's history.
Cult of Rationality
Less Wrong has been referred to as a cult (or "phyg") on numerous occasions,[1][2][3] with Eliezer Yudkowsky as its leader. Yudkowsky's confidence in his AI safety work, conducted outside mainstream academia, together with his self-professed intelligence, has made him highly unpopular with his critics.[4]
The Neoreaction movement[5] is notoriously adjacent to the community. While it has been explicitly repudiated by figures such as Eliezer Yudkowsky[6][7] and Scott Alexander,[8] critics often associate the two.[9][10]
The movement has been criticized for overemphasizing inductive reasoning over empiricism,[11] a criticism that Scott Alexander has rebutted.[12]
The Roko's basilisk thought experiment was notorious in that it depended on specific preconditions, found almost exclusively within the Less Wrong community, that rendered the reader vulnerable to this "memetic hazard". As such, it has drawn derision from critics who feel the risk posed by unfriendly AI is overstated within the community.[13][14]
Less Wrong's community was partially founded by soliciting users from the transhumanist SL4 mailing list, and Eliezer Yudkowsky is himself a prominent transhumanist.
As such, the fringe nature of transhumanist ideas such as cryonics and AGI takeover[15] has met with continued scorn from the skeptics based at RationalWiki.[16]