a theory about why the rationalist community has trended a bit more right wing over time that ive considered for a while now, though i doubt im the first one to have this thought.
a lot of the community in the late 00s/early 2010s were drawn from internet atheist circles, like me. but the thing that was selected for there wasn’t nonbelief in god, or even skepticism qua skepticism, but something like, unusual amounts of irritation when one sees the dominant culture endorse a take that is obviously bad. at the time, the obviously bad but endorsed takes were things like “homosexuality is a sin and therefore bad”, “intelligent design”, and christians refusing to actually follow the teachings of jesus in terms of things like turning the other cheek and loving thy neighbours and not caring about the logs in their own eyes.
there will always be people who experience unusual amounts of irritation when they see the culture endorse (or passively accept) a take that is obviously bad, and this is great, because those people are great. but internet christians don’t really exist anymore? instead the obviously wrong things that most internet goers see by default are terrible strawmanny sjw takes: “IQ is a fake white supremacist notion”, “there are no biological differences between men and women”, “indigenous people get to do the blood and soil thing but no one else gets to do that for unexplained reasons”. so the people who show up now tend to be kinda mad about the sjws.
i am not saying that the sjw takes are unusually bad[1]; lots of other popular communities have even worse takes. but bad social justice takes are unusually endorsed by cultural gatekeepers, the way e.g. k-pop stans aren’t, and that’s the thing that lots of protorationalists really can’t stand.
after coming up with this theory, i became a lot less sad about the community becoming [edit: more] right wing. because it makes it a lot easier to believe that the new people are still my people in the most important ways. and it doesn’t seem unlikely to me that the bright-eyed youngsters finding the community in 2030 would be irritated by and unusually fixated on disproving an entirely different set of popular beliefs trendy in the culture by then.
actually, i think that the non-strawman versions of the sjw takes listed are all genuinely really interesting and merit at least some consideration. i’ve been reading up on local indigenous history recently and it’s the most fascinating topic i’ve rabbit-holed into in ages.
I’m not persuaded that rationalists actually did turn towards the right. For example, when I looked at the proportion of people who identified as liberal across a few years sampled from the history of the LessWrong survey, the number seems consistent over time. Why do you think they did?
I agree that for a while, the main culture war rats engaged in was the anti-wokeism one, which made us look more right wing. But I don’t know if it e.g. led to more American rats voting Republican (my guess is that the proportion of rats voting Republican has in fact gone down over this time period because of Trump).
ah, i think i misspoke by saying “the community becoming right wing” in my original post. that is a strong overstatement, I’ll correct that.
i agree that rationalists are still very progressive, but i think there’s also been a noticeable but small rightward shift. some examples of what i’ve noticed outside of reflexive allergic responses to social justice posts:
increasing endorsement/linking of right wing figures like hanania and cremieux
at the same time, increasing… ~culture of silence? around certain controversial left-coded topics, eg what happened with nick decker
increased distrust in the government’s capacity to do things by default, more advocacy of pro-capitalist, free market and libertarian ideals. low confidence but i feel like i can kind of assume that the median rat has libertarian sympathies now in a way that i couldn’t before?
[from another comment i made] scott’s description of the grey tribe characterizes members as like, feeling vague annoyance that the issue of gay marriage even comes up, but because of the pronatalism it feels like re-litigation of abortion rights and gay acceptance is beginning to enter the community overton window again. to me, this feels like going further than reflexive anti-wokeism. meanwhile technological solutions seem to be somewhat sidelined[1].
i think a lot of the above examples are quite path dependent and i’m sympathetic to some of their developments. i’m even fine if some would like to make the claim that these are all indicators of the community becoming more well-calibrated in a certain sense. but it does kind of seem like a real rightward trend to me?
i also don’t think this shift will result in significantly more rats voting republican, but that’s because i think voting republican is more of a signal of red tribe belonging than it is of like, actual political belief. one of my good friends from this community is a republican who hasn’t voted red in the presidential elections in ages.
sidelined in the discourse. individual people and organizations loosely affiliated with rationality are doing really cool things around reproductive tech, and this is of course much more important.
increasing endorsement/linking of right wing figures like hanania and cremieux
Idk, back in the day LessWrong had a reasonable amount of discussion of relatively right-wing figures like Moldbug and other neoreactionaries, or on the less extreme end, people like Bryan Caplan. And there’s always been an undercurrent of discussion of e.g. race and IQ.
low confidence but i feel like i can kind of assume that the median rat has libertarian sympathies now in a way that i couldn’t before?
I feel like the median rat had strong libertarian sympathies 10 years ago.
i think these facts can be consistent with a theory like, the rationalists went from being 15% right wing to 20% right wing in the last ten years?
I think that shifting from 15% to 20% over ten years is so plausible under the null hypothesis that it doesn’t really cry out for explanation, and any proposed explanation has to somehow explain why it didn’t lead to a larger effect!
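(To make the “plausible under the null hypothesis” point concrete: whether a 15% → 20% shift between two survey waves even rises above sampling noise depends mostly on how many people answered each wave. A minimal sketch of a two-proportion z-test, using hypothetical respondent counts rather than actual LessWrong survey numbers:)

```python
# Minimal sketch: two-proportion z-test for a 15% -> 20% shift between two
# survey waves. The respondent counts below are hypothetical placeholders,
# not actual LessWrong survey numbers.
from math import sqrt

def two_proportion_z(p1: float, n1: int, p2: float, n2: int) -> float:
    """z-statistic under the pooled null hypothesis of equal proportions."""
    x1, x2 = p1 * n1, p2 * n2          # implied counts of right-wing identifiers
    pooled = (x1 + x2) / (n1 + n2)     # pooled proportion under the null
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# With a few hundred respondents per wave the shift sits inside sampling
# noise (|z| < 1.96); with a few thousand it would be detectable, though
# the effect size stays small either way.
print(round(two_proportion_z(0.15, 300, 0.20, 300), 2))    # ~1.61
print(round(two_proportion_z(0.15, 2000, 0.20, 2000), 2))  # ~4.16
```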
Every so often I stumble across a question in need of a survey, and as it happens I have one.
(Smaller response rate than I’d like though, I should try that chart on the ACX survey sometime.)
i think that the non-strawman versions of the sjw takes listed are all genuinely really interesting and merit at least some consideration. i’ve been reading up on local indigenous history recently and it’s the most fascinating topic i’ve rabbit-holed into in ages.
I am interested in what/who you recommend reading here.
Rationalists turned towards the right because the left[1] became the outgroup, while the right[2] became the fargroup.
The above is somewhat glib but nonetheless true and important; see the classic Hanania article on what kinds of communities and epistemic bubbles the two sides create, and how the kind of anti-intellectualism of the right that would immediately turn rationalists off instead became an “out of sight, out of mind” type of deal.
Also, see this (from Scott):
Republicans still “threaten” me in the sense of being able to enact policies that harm me. And people less privileged than I am face even more threats – a person dependent on food stamps has a lot to fear from Republican victories. But Republicans aren’t taking over my social circle or screaming in my face. In a purely social context they start to seem more like cartoonish and distant figures of evil, rather than neighbors and coworkers. The average Trump voter no longer seems like an uncanny-valley version of me; they seem like some strange inhabitant of a far-off land with incomprehensible values, just like ISIS.
The cultural left, more specifically; the kinds of people trying to cancel Scott Alexander over the culture war thread, for instance
The rank-and-file right, more specifically, i.e., >90% of the actual Trump base
I don’t think this really tracks. I don’t think I’ve seen many people want to “become part of the political right”, and it’s not even the case that many people voted for republicans in recent elections (indeed, my guess is fewer rationalists voted for republicans in the last three elections than previous ones).
I do think it’s the case that on a decade scale people have become more anti-left. I think some of that is explained by background shift. Wokeness is on the decline, and anti-wokeness is more popular, so baserates are shifting. Additionally, people tend to be embedded in coastal left-leaning communities, so they develop antibodies against wokeness.
Maybe this is what you were saying, but “out of sight, out of mind” implies a miscalibration about attitudes on the right here, where my sense is people are mostly reasonably calibrated about anti-intellectualism on the right, but approximately no one was considering joining that part of the right, nor was anyone that threatened by it on a personal level, and so it doesn’t come up very much.
Hmm. I have no doubt you are more personally familiar with and knowledgeable of the rationality community than I am, especially when it comes to the in-person community, so I think it’s appropriate for me to defer here a fair bit.
Nevertheless, I think I still disagree to some extent, or at least remain confused on a few matters about the whole “miscalibration about attitudes on the right” thing. I linked a Wei Dai post upthread titled “Have epistemic conditions always been this bad?” which begins (emphasis mine):
In the last few months, I’ve gotten increasingly alarmed by leftist politics in the US, and the epistemic conditions that it operates under and is imposing wherever it gains power. (Quite possibly the conditions are just as dire on the right, but they are not as visible or salient to me, because most of the places I can easily see, either directly or through news stories, i.e., local politics in my area, academia, journalism, large corporations, seem to have been taken over by the left.)
I have not seen corresponding posts or comments on LW worrying about cancellations from the political right (or of targeted harassment of orgs that collaborated with the Biden administration or other opponents of Trump, etc., as we are currently seeing in practice).
I also recall seeing several “the EA case for Trump” posts, the most popular of which was written by prominent LW user Richard Ngo, who predicted the Trump administration would listen to right-wing tech elites like Musk, Thiel, (especially!) Vivek etc. (“over the next 5–10 years Silicon Valley will become the core of the Republicans”) and reinvigorate institutions in Washington, cleansing them of the draconian censorship regimes, bureaucracies that strangle economies, and catastrophic monocultures. This… does not seem to have panned out, in any of the areas I’ve just mentioned. Others are analyzed here; my personal contribution is that several rats I know who are Hanania fans (and voted for Trump) were very surprised that Trump 2.0 was not a mere continuation of Trump 1.0 and instead turned very hostile to free trade and free markets.
(I did not see any corresponding “Rats for Harris” or “EAs for Harris” posts; maybe that’s a selection effect problem on my end?)
Moreover, many of the plans written last year on this very site for how the AI safety community should reach out to the executive branch, either to communicate issues about AI risk or to try to get them to implement governance strategies, etc., seemed… not to engage with the reality of what having actual Donald Trump in power would mean in this respect? Or for example, they did not engage with the possibility of having David Sacks be the official US AI Czar and dismiss everything that’s not maximally supportive of AI and tech bros? Maybe AI governance people in their private conversations are adding in stuff like “and let’s make sure we personally give an expensive gift to Trump through his lackeys when we meet with the agency, otherwise we’ll be dismissed outright,” but I’m not seeing public acknowledgements of how to deal with Trump being the president from those whose plans and desires route through the US executive taking bold international action when it comes to AI.
Also, very many (definitely a majority of) users on the EA Forum, and even top brass at GiveWell, seemed shocked and entirely unprepared when USAID was shut down. I don’t have all the links handy right now, but this certainly seems to reflect a failure to predict what the Trump administration would do, even though Project 2025 talked a fair bit about how to restructure and crack down on USAID. Perhaps you wouldn’t consider the EA and rationality communities to be the same, but the overlap seems quite substantial to me.
(I did not see any corresponding “Rats for Harris” or “EAs for Harris” posts; maybe that’s a selection effect problem on my end?)
Are you somehow implying the community isn’t extremely predominantly left? If I remember the stats correctly, for US rationalists, it’s like 60% democrats, 30% libertarians, <10% republicans. The reason why nobody wrote a “Rats for Harris” post is because that would be a very weird framing with the large majority of the community voting pretty stably democratic.
Almost the entirety of my most recent comment is just about the “rationalists were/weren’t miscalibrated about the anti-intellectualism etc of the Trump campaign.”
Trump is good at making people see whatever they want to see in him, even if it is different things for different people. That’s what makes him a successful politician.
Many rationalists enjoy uncritical contrarianism: they say things that defy common sense to signal how much smarter they are, and even if that’s not the way to make the best predictions, it is a way to occasionally make a weird prediction that turns out to be correct, so you can be proud of it and conveniently forget many other similar predictions that turned out to be wrong.
So yeah, this is a bad combination, because no matter how much evidence we get, the game of pretending that everything Trump does is a 5D-chess move is too enjoyable. Trump does things; if some of them happen to be good, it is “I told you so”, and if some of them happen to be bad, it is “just wait, I am sure this is all a part of a greater plan”. But the only plan is to get more power for Trump; the consequences for the economy, society, education, science, etc. are mere side effects. Anyone who still doesn’t get it is too addicted to wishful thinking.
(I wonder about Project 2025. I don’t know the details, but it wouldn’t surprise me to find out that even its authors are disappointed by Trump. At least this review on EA Forum sounds to me much smarter and more coherent than anything the Trump administration actually did.)
huh, yeah, I think this is a pretty reasonable alternate hypothesis.
i do notice that there’s starting to be promising intellectual stuff coming from a right wing perspective again. i think this trend will continue and eventually there will be some enterprising zoomer publication that cracks the nut and gains genuine mainstream respectability as some sort of darling heterodox publication.
this would mean that even if the outgroup/fargroup distinction is the dominant force at play, it doesn’t indicate a permanent spiral towards right wing ideals in the community, as long as there continues to be new blood. it’s still all downstream of what’s going on in mainstream culture, yeah?
As further evidence for my position (and honestly also yours, they’re not necessarily in conflict), I bring up Wei Dai’s “Have epistemic conditions always been this bad?”, where he explains he has “gotten increasingly alarmed by leftist politics in the US, and the epistemic conditions that it operates under and is imposing wherever it gains power” but also mentions:
Quite possibly the conditions are just as dire on the right, but they are not as visible or salient to me, because most of the places I can easily see, either directly or through news stories, i.e., local politics in my area, academia, journalism, large corporations, seem to have been taken over by the left.
i do notice that there’s starting to be promising intellectual stuff coming from a right wing perspective again
Could you give me some references of what you’re talking about? I’d be very excited to read more about this. Most of what I’ve seen in terms of promising changes in the political sphere these days has been the long-overdue transition of the Democratic party mainstream to the Abundance agenda and the ideas long championed by Ezra Klein, Matt Yglesias, and Noah Smith, among others.
I’ve seen much less on the right, beyond stuff like Hanania’s Substack (which is very critical of the right these days). The IQ realignment seems real, with an ever-increasing share of Elite Human Capital moving to the left in the face of the Trump administration’s attacks on liberal ideals, constitutionalism, science funding, mainstream medical opinions (with the appointment of and cranky decisions taken by RFK Jr.), etc.
i think this trend will continue and eventually there will be some enterprising zoomer publication that cracks the nut and gains genuine mainstream respectability as some sort of darling heterodox publication
I’d love to be wrong about this, but I think it’s very unlikely this will actually happen. Modern epistemic conditions and thought bubbles seem to make the rise of genuine heterodoxy in the mainstream basically impossible. In modern times, the left requires ideological conformity[1] while the right demands personal loyalty.[2]
Heterodox organizations can only really float about in centrist waters, mostly populated by the center-left these days. The political left will demand too much agreement on issues like crime, immigration, transgender rights, rent control, etc., for heterodoxy to be tolerated. And while the political right embraces new blood of all kinds, that’s only if all criticism of the Trump administration is censored, preventing honest discourse on the most important political fights of this age.
I.e., agreement with all leftist or progressive views espoused by The Groups
I.e., subservient fealty to Donald Trump
What do you mean by “there’s starting to be promising intellectual stuff coming from a right wing perspective again?” I think what one means by “right wing” and “promising intellectual stuff” needs to be clarified.
Because if it’s just obvious stuff like “transgender women shouldn’t be allowed to compete in sports with cis women” or “iq isn’t bullshit” then, I don’t know, neither of those things seems very heterodox. But if it’s stuff like “gender affirming care doesn’t improve outcomes and shouldn’t be available to children” but said in a respectable way, that seems more “heterodox” but it also seems much more insidious.
If you’re a rich person in tech with high status, a “darling right wing heterodox publication” could be a nice thing to read and discuss with your friends (à la “leisure of the theory class”). But if you have much less social power, a “darling right wing heterodox publication” looks like it will be one large “isn’t it wonderful to be a part of the genetically fortunate?” circlejerk.
Abstract intellectual discussions about liberty and security seem very interesting, but the audience for that stuff is very small. A substantial portion of “heterodox right-wing publications” look more like Thiel’s stuff at Stanford than they do the Federalist Papers.
A lot of this has to do with how what it means to be left/right has changed.
Rationalists usually don’t like following authorities. That was left-wing coded in late 00s/early 2010s and is more right-wing coded today.
I valued Glenn Greenwald’s political views two decades ago and I value them today. On all the issues that are most important to him, Glenn still holds the same views today as he did two decades ago. However, while Glenn was seen as clearly left-wing back then, he’s frequently seen as right-wing today.
Yeah, we need to distinguish between “someone had an opinion X, but changed to Y” from “someone’s opinion X was perceived as left-wing a decade ago, but is perceived as right-wing now”. And maybe also from “someone has always believed X, but expressing such belief could previously get them fired, so they kept quiet about it”.
To me it seems that my beliefs have not changed much recently (of course that may be a trick my brain plays on itself, where after updating it creates a false memory that I have always believed the new thing); it’s just that when I am surrounded by people who yell at me “IQ is a myth” and I disagree, they call me a right-winger, and when I am surrounded by people who yell at me “charity is stupid, let the poor people die” and I disagree, they call me a left-winger. So whatever people call me seems to me more of a fact about them than about me. (More precisely, all the things they call me, taken together, with the specific reasons why they called me that, that is about me. But which group happened to yell at me today, that is about the group.)
So when we say that “the rationalist community is recently a bit more right wing”, what specifically does it mean?
Also, we were already called right-wing in the past; are we really more right-wing today than back when we had debates about neoreaction, or is this just an overreaction to some minor change that happened in recent months?
tl;dr: step one is providing evidence that we are now more right-wing than e.g. 10 years ago
step one is providing evidence that we are now more right-wing than e.g. 10 years ago
honestly this is a pretty reasonable take.
my own experience is that it has, but this could have been for pretty idiosyncratic reasons. scott in his description of the grey tribe characterizes members as like, feeling vague annoyance that the issue of gay marriage even comes up, right? but because of the pronatalism it feels like fundamental rights to things like abortion and gay acceptance are being re-litigated in the community now (more specifically, the re-litigation has entered the overton window, not that it’s an active and ongoing debate), meanwhile technological solutions seem to be sidelined, and this has been quite dismaying for me.
I don’t know, the obviously wrong things you see on the internet seem to differ a lot based on your recommendation algorithm. The strawmanny sjw takes you list are mostly absent from my algorithm. In contrast, I see LOTS of absurd right-wing takes in my feed.
i don’t actually see strawmanny sjw takes either. my claim is that the default algorithms on large social media sites tend to expose most people to anti-sjw content.
I see. Why do you have this impression that the default algorithms would do this? Genuinely asking, since I haven’t seen convincing evidence of this.