Criticisms of The Rationalist Movement

Last edit: 18 Jan 2024 14:13 UTC by FallCheetah7373

Criticisms of the rationalist movement and LessWrong have been raised on various grounds throughout the site's existence.

Cult of Rationality

Less Wrong has been referred to as a cult ("phyg" in community jargon) on numerous occasions,[1][2][3] with Eliezer Yudkowsky as its leader. Eliezer's confidence in his AI safety work, conducted outside of mainstream academia, together with his self-professed intelligence, makes him a frequent target for critics.[4]


The Neoreaction movement[5] is notoriously adjacent to the community. While it has been explicitly repudiated by figures such as Eliezer[6][7] and Scott,[8] critics often actively associate it with the movement.[9][10]


The movement has been criticized as overemphasizing inductive reasoning over empiricism,[11] a criticism that Scott Alexander has rebutted.[12]

Roko’s basilisk

The Roko's basilisk thought experiment was notorious in that the specific preconditions rendering a reader vulnerable to this 'memetic hazard' were found almost exclusively within the Less Wrong community. As such, it has drawn derision from critics who feel the perceived risk from unfriendly AI is overstated within the community.[13][14]


Less Wrong's community was partially founded by soliciting users from the transhumanist SL4 mailing list, and Eliezer Yudkowsky is himself a prominent transhumanist.

As such, fringe transhumanist ideas such as cryonics and AGI takeover[15] have met with continued scorn from the skeptics based at RationalWiki.[16]


  1. http://lw/4d/youre_calling_who_a_cult_leader/

  2. http://lw/bql/our_phyg_is_not_exclusive_enough/

  3. https://r/OutOfTheLoop/comments/3ttw2e/what_is_lesswrong_and_why_do_people_say_it_is_a/

  4. http://wiki/Eliezer_Yudkowsky

  5. https://web/20130424060436/http://2013/04/21/visualizing-neoreaction/

  6. http://post/142497361345/this-isnt-going-to-work-but-for-the-record-and

  7. http://lw/fh4/why_is_mencius_moldbug_so_popular_on_less_wrong/

  8. http://2013/10/20/the-anti-reactionary-faq/

  9. http://wiki/Neoreactionary_movement

  10. https://wiki/The_Silicon_Ideology

  11. https://almostdiamonds/2014/11/24/why-i-am-not-a-rationalist/

  12. http://2014/11/27/why-i-am-not-rene-descartes/

  13. http://wiki/Roko’s_basilisk

  14. http://talks/superintelligence.htm

  15. http://wiki/Cybernetic_revolt

  16. http://wiki/Transhumanism

Agreeing With Stalin in Ways That Exhibit Generally Rationalist Principles
Zack_M_Davis, 2 Mar 2024 22:05 UTC, 36 points, 19 comments, 58 min read, LW link

Public-facing Censorship Is Safety Theater, Causing Reputational Damage
Yitz, 23 Sep 2022 5:08 UTC, 149 points, 42 comments, 6 min read, LW link

Some blindspots in rationality and effective altruism
Remmelt, 19 Mar 2021 11:40 UTC, 37 points, 44 comments, 14 min read, LW link

A Hill of Validity in Defense of Meaning
Zack_M_Davis, 15 Jul 2023 17:57 UTC, 8 points, 118 comments, 75 min read, LW link

If Clarity Seems Like Death to Them
Zack_M_Davis, 30 Dec 2023 17:40 UTC, 40 points, 191 comments, 87 min read, LW link

Musings on Cargo Cult Consciousness
Gareth Davidson, 25 Jan 2024 23:00 UTC, −13 points, 11 comments, 17 min read, LW link

Blanchard's Dangerous Idea and the Plight of the Lucid Crossdreamer
Zack_M_Davis, 8 Jul 2023 18:03 UTC, 37 points, 135 comments, 72 min read, LW link

Criticism of some popular LW articles
DirectedEvolution, 19 Jul 2020 1:16 UTC, 71 points, 19 comments, 6 min read, LW link

No Really, Why Aren't Rationalists Winning?
Sailor Vulcan, 4 Nov 2018 18:11 UTC, 35 points, 90 comments, 5 min read, LW link

Self-Congratulatory Rationalism
ChrisHallquist, 1 Mar 2014 8:52 UTC, 73 points, 395 comments, 10 min read, LW link

Extreme Rationality: It's Not That Great
Scott Alexander, 9 Apr 2009 2:44 UTC, 242 points, 281 comments, 8 min read, LW link

Motivated Cognition and the Multiverse of Truth
Q Home, 22 Nov 2022 12:51 UTC, 8 points, 16 comments, 24 min read, LW link

EIS V: Blind Spots In AI Safety Interpretability Research
scasper, 16 Feb 2023 19:09 UTC, 54 points, 23 comments, 13 min read, LW link

You're Calling *Who* A Cult Leader?
Eliezer Yudkowsky, 22 Mar 2009 6:57 UTC, 67 points, 121 comments, 5 min read, LW link

Our Phyg Is Not Exclusive Enough
[deleted], 14 Apr 2012 21:08 UTC, 43 points, 518 comments, 3 min read, LW link

Putanumonit: If rationality is a religion, it's a crappy one.
Jacob Falkovich, 15 Apr 2017 16:44 UTC, 8 points, 0 comments, 1 min read, LW link

[LINK] EA Has A Lying Problem
Benquo, 11 Jan 2017 22:31 UTC, 28 points, 34 comments, 1 min read, LW link

Why I Am Not a Rationalist, or, why several of my friends warned me that this is a cult
Algernoq, 13 Jul 2014 17:54 UTC, 20 points, 193 comments, 3 min read, LW link

Self-Improvement or Shiny Distraction: Why Less Wrong is anti-Instrumental Rationality
patrissimo, 14 Sep 2010 16:17 UTC, 155 points, 261 comments, 16 min read, LW link

Critiques of prominent AI safety labs: Redwood Research
Omega., 17 Apr 2023 18:20 UTC, 1 point, 0 comments, 22 min read, LW link

In defence of epistemic modesty [distillation]
Luise, 10 May 2023 9:44 UTC, 16 points, 2 comments, 9 min read, LW link

Criticism of EA Criticism Contest
Zvi, 14 Jul 2022 14:30 UTC, 108 points, 17 comments, 31 min read, LW link, 1 review

Ideas for studies on AGI risk
dr_s, 20 Apr 2023 18:17 UTC, 5 points, 1 comment, 11 min read, LW link

The Craft & The Community—A Post-Mortem & Resurrection
Bendini, 2 Nov 2017 3:45 UTC, 76 points, 107 comments, 68 min read, LW link

[Linkpost] Leif Wenar's The Deaths of Effective Altruism
Arden, 27 Mar 2024 19:17 UTC, 8 points, 1 comment, 1 min read, LW link