Can a reasonable Wikipedia editor take a stab at editing the “Rationalist Community” Wikipedia page into something normal? It appears to be edited by the usual RationalWiki crowd, who have previously been banned from editing Wikipedia articles in the space due to insane levels of bias.
I don’t want to edit it myself because of COI (conflict of interest), but I am sure there are many people out there who can do a reasonable job. The page currently says inane things like:
Rationalists are concerned with applying Bayesian inference to understand the world as it really is, avoiding cognitive biases, emotionality, or political correctness.
or:
The movement connected to the founder culture of Silicon Valley and its faith in the power of intelligent capitalists and technocrats to create widespread prosperity.[8][9]
Or completely inane things like:
Though this attitude is based on “the view that vile ideas should be countenanced and refuted rather than left to accrue the status of forbidden knowledge”,[19] rationalists also hold the view that other ideas, referred to as information hazards, are dangerous and should be suppressed.[20] Roko’s Basilisk and the writings of Ziz LaSota are commonly cited information hazards among rationalists.[17]
It’s obviously not an article that’s up to Wikipedia’s standards.
If you want some context on the history of the editors in the space: https://www.tracingwoodgrains.com/p/reliable-sources-how-wikipedia-admin
The LessWrong article used to be similarly horrendous, but was eventually transformed into something kind of reasonable (though still not great). Looking through the archived talk pages for that should give a good sense of what kind of policies apply, as well as a bunch of good sources.
I wonder what the optimal reaction to situations like this is. My first idea is to collectively prepare a response on Less Wrong, which could then be posted on the article talk page. The response would be relatively brief, list the factual errors, and optionally propose suggestions along with references.
Collectively, because it will be easier for Wikipedia editors to engage with one summary input from our community, rather than several people making partial comments independently. Also, because the quality of the response could be higher if e.g. someone notices an error, someone else finds a reference supporting the complaint, and maybe another person helps make the entire argument more compatible with the Wikipedia rules.
Also, someone may be wrong about something, or something can be ambiguous. For example, I keep wondering about the statement that the rationality community formed around LW and SSC. LW, sure. But Scott had been posting on LW since 2009, and when he started SSC in 2013, I would say the rationality community had already formed, albeit much smaller than it is now. SSC as a separate blog actually attracted a non-rationalist audience to Scott’s writing, and Scott often posted things there that wouldn’t have fit on LW back then, such as jokes and fiction. And even today, I think only a minority of ACX readers identify as aspiring rationalists. More often, they make fun of rationalists.
It took me a while to figure out that “common interests include statistics” probably refers to Bayesianism. At least I think so. Isn’t it weird that I am not sure about one of our most important common interests?
“CFAR teaches courses based on HPMOR”; I think the causality is probably in the opposite direction.
“rationalists also hold the view that [...] information hazards, are dangerous and should be suppressed. [...] the writings of Ziz LaSota are commonly cited information hazards among rationalists”; when you put it together like this, it strongly suggests that rationalists believe that Ziz’s blog should be suppressed, but I have never heard such a proposal.
I find it interesting that post-rationalists are described as people who perceive the rationality community as cult-like, when my impression was that the original objection was about the community not paying sufficient respect to ancient wisdom, especially religion.
(Let me guess, this is going to be linked from Wikipedia as “Viliam proposes brigading, be very careful and during the next 100 days revert all changes to the page, and make sure to lock the talk page”.)
EDIT: I am curious how Wikipedia editors decide who is and who isn’t a member of the rationalist community. For example, Zizians are referred to as rationalists (not “ex-rationalists”), so… once a rationalist, always a rationalist?
I see a subtle distinction there, between “a member of the rationalist community” and “a rationalist”.
I would say the latter is “someone who has thinking strategies and acting strategies that enable them to achieve more beneficial and complex things”—or, for more verifiability, “someone whose thinking and acting strategies are worth copying”. Using the second definition, I would neither claim nor disclaim being a rationalist, because my strategies are mostly [native code] which cannot be copied so easily. In any case, it is not possible to disavow someone’s being a rationalist, because that statement is mostly about them.
The former, “a member of the rationalist community”, is essentially “someone who keeps in contact with the specific community, exchanges ideas, favors and so on”. That is possible to “excommunicate”.
Even more distinctions are possible...
Thinking/acting style: “mainstream rationality” or “x-rationality”.
Social behavior: ignores the LW community entirely, reads the website, posts on the website, attends meetups, meets other rationalists even outside meetups, lives in a group house.
Identity: identifies as a “rationalist”, or just “someone who hangs out with rationalists, but is not one of them”.
And even this is not clear. Using the Zizians as an example: they are clearly inspired by some memes in the LW community, but they also clearly reject some other memes (such as ethical injunctions); are they “x-rationalists” by thinking style? They used to live in the Bay Area and recruit among the rationalists, but they also protested against MIRI and CFAR; were they members of the community at that moment? No idea whether they identified as “rationalists” or anything else.
The Zizians are a small, renegade, spin-off group with an ideological emphasis on veganism and anarchism, which became well known in 2025 for being suspected of involvement in four murders. The Zizians originally formed around the Bay Area rationalist community, but became disillusioned with other rationalist organizations and leaders. Among the Zizians’ accusations against them were anti-transgender discrimination, misuse of donor funds to pay off a sexual misconduct accuser, and not valuing animal welfare in plans for human-friendly AI.
I am actually quite okay with this. It mentions the important things: “spin-off group” (i.e. their membership is history), “veganism and anarchism” (their motivations other than rationalism). The only way I can imagine it better, from my perspective, would be to add the years, to make it clear that their participation in the community was 2014-2019, and the murders 2022-2025 (i.e. no overlap).
The part I don’t like is the introduction to the “Zizians” article, which starts with:
The Zizians are an informal group of rationalists with anarchist and vegan beliefs
With the word “rationalists” pointing to the “Rationalist community” article. You see the rhetorical trick: anarchism and veganism are their beliefs, but rationalists is what they are. The sentence does not explicitly claim that they are members of the community (as opposed to just someone trying to be more rational), but that’s where the hyperlink points. Also, the present tense.
This is all spin; one could equally validly say e.g. “Zizians are an informal group of anarchist vegans who met each other during the years they spent in the rationality community.”
Ctrl+F TESCREAL … of course it is there. It is a thing that doesn’t even exist, but of course Wikipedia mentions it.
Oliver, the fact that you even mentioned this is considered “canvassing” (a word I didn’t even know existed) and is apparently against the rules of Wikipedia.
Wikipedia defines canvassing as notifying other editors of ongoing discussions with the intention of influencing the outcome. It is considered inappropriate because it compromises the normal consensus-making process. The proper ways to raise an issue are:
the talk page or noticeboard of a related WikiProject (e.g. WikiProject Effective Altruism)
a central place such as the Village Pump
if you complain about a specific editor, you can do it on their talk page
Make sure to be polite, neutral, and brief.
Of course this is now used as an excuse to revert any recent attempts to improve the article.
I guess the lesson is that the next time you complain about what a horrible mess some Wikipedia article is, you must refrain from explicitly suggesting that anyone improve it. It is important to follow
I don’t think it counts as canvassing in the relevant sense, as I didn’t express any specific opinion on how the article should be edited. I think maybe you could argue I did vote-stacking, but I think the argument is kind of weak.
Tracing Woodgrains’ post did just successfully fix a bunch of articles. I used to be more hesitant about this, but I think de facto you somehow need to draw attention to an article that needs to be improved, and posting publicly about it is more within the spirit of WP:CANVAS than anything else I actually expect to work (Wikipedia editors with more experience on the issue should raise things however they are supposed to on WP, including posting wherever is appropriate on internal WP boards).
Of course this is now used as an excuse to revert any recent attempts to improve the article.
From reading the relevant talk page, it is pretty clear that those arguing against the changes on these grounds aren’t exactly doing so in good faith, and if they did not have this bit of ammunition to use they would use something else, though then with fewer detractors (since clearly nobody else followed or cared about that page).
I remember editing an abrasive sentence there a few months ago:
Members of the rationalist community believe only a small number of people, namely including themselves, have the unique abilities knowledge and skill required to reduce the probability of human extinction
Regardless of the accuracy of this statement,[1] the previous characterisation was a bit too unhinged, and conveyed a pompous picture of the rationality community.
The current version of the page seems to have gone even further on the snark. I had this discussion on the Bayesian Conspiracy Discord a few months ago; a bunch of people there thought this would be hard to improve, because Wikipedia only allows reputable sources and the capital-R Rationality community is worse at rhetoric. It is possible there is a plethora of mainstream news articles praising the rationality community that I simply didn’t find in my preliminary searches a few months ago; but general negativity bias, plus the fact that I was only exposed to strawmen of the LessWrong-adjacent community when I first found out about it last year (which made me reluctant and paranoid about using this website for months, thanks to @David_Gerard), makes me less optimistic about that as a prior.
[1] I do think only a small number of people may have the knowledge and skills to do AI alignment.
I recently noticed just how bad LW’s reputation is outside of the community. It is like reading the description of an alternative-reality LW made of far-right cranks and r/atheism mods.
Also, why does RationalWiki hate LW so much? What is the source of all that animosity?
David Gerard, one of the founders of RationalWiki, actually used to hang out here a lot. I think a bunch of people gave him negative feedback, he got into a dispute with a moderator, and he seems to have walked away with a grudge, trying to do everything within his power to destroy LessWrong. See Tracing Woodgrains’ post for a bunch of the history.
I am not too familiar with RationalWiki but my impression is the editors come from a certain mindset where you always disbelieve anything that sounds weird, and LWers talk about a lot of weird stuff, which to them falls in the same bucket as religion / woo / pseudoscience. And I would think they especially dislike people calling themselves “rationalists” when in actuality they’re just doing woo / pseudoscience.
If you’re AI-pilled enough you can also build fact checking and search functionality on top. o3 can see through the lies. I don’t think most of humanity is going to rely on Wikipedia editors for access to ground truth for very long.
@habryka I mean the readership of Wikipedia is going to go down if someone builds a better website to replace it. Wikipedia plus community-notes-like voting is an example. So you can build this instead.
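To make “community-notes-like voting” a bit more concrete, here is a minimal toy sketch of the basic idea (my own illustration with made-up rater names and data, not the actual Community Notes algorithm, which uses matrix factorization over rater viewpoints): a note only counts as good when raters who usually disagree with each other both rate it helpful.

```python
# Toy sketch of "community-notes-like" voting (illustration only, with made-up
# data; not the real Community Notes algorithm): a note counts as good only
# when raters who usually DISAGREE with each other both rate it helpful.

from itertools import combinations

# Hypothetical ratings: rater -> {note_id: 1 (helpful) or 0 (not helpful)}
ratings = {
    "alice": {"n1": 1, "n2": 0, "n3": 1},
    "bob":   {"n1": 1, "n2": 1, "n3": 0},
    "carol": {"n1": 1, "n2": 0, "n3": 0},
}

def agreement(r1, r2):
    """Fraction of co-rated notes on which two raters gave the same rating."""
    shared = set(ratings[r1]) & set(ratings[r2])
    if not shared:
        return None
    return sum(ratings[r1][n] == ratings[r2][n] for n in shared) / len(shared)

def bridged_score(note_id, max_agreement=0.5):
    """Count 'helpful' votes coming from pairs of raters who usually disagree."""
    score = 0
    for r1, r2 in combinations(ratings, 2):
        a = agreement(r1, r2)
        if a is None or a > max_agreement:
            continue  # a pair that mostly agrees carries little bridging information
        if ratings[r1].get(note_id) == 1 and ratings[r2].get(note_id) == 1:
            score += 1
    return score

for note in ("n1", "n2", "n3"):
    print(note, bridged_score(note))  # only n1 gets a cross-camp "helpful" pair
```

The point of the design is that simple majority voting would reward whichever faction shows up in larger numbers, while requiring agreement across raters who normally disagree filters for notes (or article edits) that both sides can live with.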
Can it see through the stereotypes too? From what I saw (though I used Grok for this test, and that might be a relevant factor), LLMs are nowhere near guessing that LW might discuss parenting or interior design; instead they devise more and more specific fields to intersect with rationality.
Try again, and now guess the #39, #40, and #41 topics by discussion volume on LessWrong; you are now allowed to think explicitly about the top thirty-eight if you wish.
... #39: Rationalist Approaches to Understanding and Managing Complex Systems... #40: The Ethics and Implications of Quantum Computing... #41: Rationality in Interpersonal Relationships and Communication...