Criticisms of The Rationalist Movement

Last edit: 28 Jun 2022 15:21 UTC by Ilya Venger

Criticisms of the rationalist movement and LessWrong have been raised on various grounds throughout the site's history.

Cult of Rationality

Less Wrong has on numerous occasions been called a cult (sometimes via the rot13 euphemism "phyg"),123 with Eliezer Yudkowsky as its supposed leader. Eliezer's confidence in his AI safety work, conducted outside mainstream academia, together with his self-professed intelligence, makes him a frequent target of critics.4

Neoreaction

The Neoreaction movement5 is notoriously adjacent to the community. While it has been explicitly repudiated by figures such as Eliezer67 and Scott,8 critics often associate it with the community.910

Rationalism

The movement has been criticized for overemphasizing inductive reasoning over empiricism,11 a criticism that Scott Alexander has responded to at length.12

Roko’s basilisk

The Roko’s basilisk thought experiment was notorious in that it depended on premises held almost exclusively within the Less Wrong community, which rendered readers there vulnerable to this ‘memetic hazard’. It has accordingly drawn derision from critics who feel that the perceived risk from unfriendly AI is overstated within the community.1314

Transhumanism

Less Wrong’s community was partially founded by soliciting users from the transhumanist SL4 mailing list, and Eliezer Yudkowsky is himself a prominent transhumanist.

As such, the fringe nature of transhumanist ideas such as cryonics and AGI takeover15 has drawn continued scorn from the skeptics at RationalWiki.16

References

  1. http://lesswrong.com/lw/4d/youre_calling_who_a_cult_leader/

  2. http://lesswrong.com/lw/bql/our_phyg_is_not_exclusive_enough/

  3. https://www.reddit.com/r/OutOfTheLoop/comments/3ttw2e/what_is_lesswrong_and_why_do_people_say_it_is_a/

  4. http://rationalwiki.org/wiki/Eliezer_Yudkowsky

  5. https://web.archive.org/web/20130424060436/http://habitableworlds.wordpress.com/2013/04/21/visualizing-neoreaction/

  6. http://yudkowsky.tumblr.com/post/142497361345/this-isnt-going-to-work-but-for-the-record-and

  7. http://lesswrong.com/lw/fh4/why_is_mencius_moldbug_so_popular_on_less_wrong/

  8. http://slatestarcodex.com/2013/10/20/the-anti-reactionary-faq/

  9. http://rationalwiki.org/wiki/Neoreactionary_movement

  10. https://hpluspedia.org/wiki/The_Silicon_Ideology

  11. https://the-orbit.net/almostdiamonds/2014/11/24/why-i-am-not-a-rationalist/

  12. http://slatestarcodex.com/2014/11/27/why-i-am-not-rene-descartes/

  13. http://rationalwiki.org/wiki/Roko's_basilisk

  14. http://idlewords.com/talks/superintelligence.htm

  15. http://rationalwiki.org/wiki/Cybernetic_revolt

  16. http://rationalwiki.org/wiki/Transhumanism

Public-facing Censorship Is Safety Theater, Causing Reputational Damage
Yitz, 23 Sep 2022 5:08 UTC
145 points, 42 comments, 6 min read, LW link

Some blindspots in rationality and effective altruism
Remmelt, 19 Mar 2021 11:40 UTC
39 points, 44 comments, 14 min read, LW link

Extreme Rationality: It’s Not That Great
Scott Alexander, 9 Apr 2009 2:44 UTC
240 points, 281 comments, 8 min read, LW link

No Really, Why Aren’t Rationalists Winning?
Sailor Vulcan, 4 Nov 2018 18:11 UTC
30 points, 89 comments, 5 min read, LW link

Self-Congratulatory Rationalism
ChrisHallquist, 1 Mar 2014 8:52 UTC
73 points, 395 comments, 10 min read, LW link

Criticism of some popular LW articles
DirectedEvolution, 19 Jul 2020 1:16 UTC
71 points, 19 comments, 6 min read, LW link

Self-Improvement or Shiny Distraction: Why Less Wrong is anti-Instrumental Rationality
patrissimo, 14 Sep 2010 16:17 UTC
149 points, 261 comments, 16 min read, LW link

Criticism of EA Criticism Contest
Zvi, 14 Jul 2022 14:30 UTC
107 points, 15 comments, 31 min read, LW link
(thezvi.wordpress.com)

The Craft & The Community—A Post-Mortem & Resurrection
Bendini, 2 Nov 2017 3:45 UTC
75 points, 106 comments, 68 min read, LW link

Motivated Cognition and the Multiverse of Truth
Q Home, 22 Nov 2022 12:51 UTC
8 points, 16 comments, 24 min read, LW link

EIS V: Blind Spots In AI Safety Interpretability Research
scasper, 16 Feb 2023 19:09 UTC
51 points, 23 comments, 10 min read, LW link

Critiques of prominent AI safety labs: Redwood Research
Omega., 17 Apr 2023 18:20 UTC
11 points, 0 comments, 22 min read, LW link
(forum.effectivealtruism.org)

Ideas for studies on AGI risk
dr_s, 20 Apr 2023 18:17 UTC
5 points, 1 comment, 11 min read, LW link

In defence of epistemic modesty [distillation]
Luise, 10 May 2023 9:44 UTC
16 points, 2 comments, 9 min read, LW link

You’re Calling *Who* A Cult Leader?
Eliezer Yudkowsky, 22 Mar 2009 6:57 UTC
67 points, 121 comments, 5 min read, LW link

Our Phyg Is Not Exclusive Enough
[deleted], 14 Apr 2012 21:08 UTC
43 points, 518 comments, 3 min read, LW link

Putanumonit: If rationality is a religion, it’s a crappy one.
Jacob Falkovich, 15 Apr 2017 16:44 UTC
8 points, 0 comments, 1 min read, LW link
(putanumonit.com)

[LINK] EA Has A Lying Problem
Benquo, 11 Jan 2017 22:31 UTC
28 points, 34 comments, 1 min read, LW link
(srconstantin.wordpress.com)

Why I Am Not a Rationalist, or, why several of my friends warned me that this is a cult
Algernoq, 13 Jul 2014 17:54 UTC
19 points, 193 comments, 3 min read, LW link