On @Gordon Seidoh Worley’s recent post, “Religion for Rationalists”, the following exchange took place:

Kabir Kumar:

Rationality/EA basically is a religion already, no?

Gordon Seidoh Worley:

No, or so say I.

I prefer not to adjudicate this on some formal basis. There are several attempts by academics to define religion, but I think it’s better to ask “does rationality or EA look sufficiently like other things that are definitely religions that we should call them religions”.

I say “no” on the basis of a few factors: …

Said Achmiz:

EA is [a religion]

Gordon Seidoh Worley:

Why do you think EA is a religion? I disagree in a sibling comment.

Said Achmiz:

… the view is hardly unique to me, nor even original to me; “EA is a religion” is something that I’ve seen quite a few people opine. Haven’t you encountered this view before? I am surprised, if that’s the case.

Gordon Seidoh Worley:

Most of the time when I hear people say “EA is a religion” it’s because they are trying to discredit EA without actually engaging with EA, so I was honestly curious what you could mean here, since it seems, to me, a claim on par with people calling rationalists a cult.

But two years ago, Gordon wrote another post, called “Religion is Good, Actually”, where he wrote:

Some things I’d consider religions but traditional definitions don’t always include. Note that for some of these, some people do these things as religions and others don’t.

Sports fans. They dedicate themselves to their team, engage in ritual behaviors to demonstrate support for their team or try to bring about victory, and they get enjoyment out of rooting for their team alongside their fellow fans.

Early-stage startups. People who join early startups are dedicated to making something happen (whatever thing it is the startup is building), engage in rituals like stand-ups and commuting and code review to bring a product into existence, and they like doing it, especially since they expect to gain large rewards if they succeed.

Effective altruism. People who identify as EAs care about doing the most good, they believe weird things like expected value calculations are better indicators of what’s most good than feelings, they commit themselves to things like earning to give or working on projects that don’t pay well but that they believe will have outsized impacts, and they honestly believe that the world will be better off because they did more good by applying the methods of effective altruism.

Why such an expansive definition? Because it’s my belief that humans are naturally religious. Our minds are structured to be part of a culture with certain features, and the way you get those features is by creating religions in approximately the sense I mean here.

(Emphasis mine.)
So, my answer to Gordon, who was “honestly curious what [I] could mean here”, is “read your own post from two years ago (where you say exactly the thing that I just said and which you said was ‘a claim on par with people calling rationalists a cult’)”.
I think being inconsistent and contradicting yourself and not really knowing your own position on most topics is good, actually, as long as you keep working to craft better models of those topics and not just flail about randomly. Good models grow and merge from disparate fragments, and in the growing pains those fragments keep intermittently getting more and less viable. Waiting for them to settle makes you silent about the process, while talking it through is not a problem as long as the epistemic status of your discussion is clear.
Sticking to what you’ve said previously, simply because it’s something you happened to have said before, opposes lightness: stepping with the winds of evidence at the speed of their arrival (including logical evidence from your own better understandings as they settle). Explicitly noting the changes to your point of view, whether declaring them publicly or even just taking the time to note them privately for yourself, can make this slightly inconvenient, and that can have a significant impact on one’s ability to actually make progress on the numerous tiny things that are not explicitly seen as some important project that ought to be taken seriously and given the effort. There is little downside to this, as far as I can tell, except for the norms with some influence that resist this kind of behavior, which, if given leave, can hold influence even inside one’s own mind.
I totally agree that being able to change your mind is good, and that the important thing is that you end up in the right place. (Although I think that your caveat “as long as you keep working to craft better models of those topics and not just flail about randomly” is doing a lot of work in the thesis you express; and it seems to me that this caveat has some requirements that make most of the specifics of what you endorse here… improbable. That is: if you don’t track your own path, then it’s much more likely that you’re flailing, and much more likely that you will flail; so what you gain in “lightness”, you lose in directionality. Indeed, I think you likely lose more than you gain.)
However.
If you change your mind about something, then it behooves you to not then behave as if your previous beliefs are bizarre, surprising, and explainable only as deliberate insults or other forms of bad faith.
Subjectively, it seems the things that are important to track are facts and observations, possibly specific sources (papers, videos), but not your own path or the provisional positions expressed at various points along the way. So: attention to detail, but not to the detail of your own positions or of externally expressed statements about them; that is rarely of any value. You track the things encountered along your own path, so long as they remain relevant and not otherwise, but never the path itself.
If you change your mind about something, then it behooves you to not then behave as if your previous beliefs are bizarre, surprising, and explainable only as deliberate insults or other forms of bad faith.

That’s the broadly accepted norm. My point is that I think it’s a bad norm, one that damages the effectiveness of lightness and does nothing useful.
Alright. Well, I guess we disagree here. I think the broadly accepted norm is good, and what you propose is bad (for the reason I describe in the grandparent comment).
Eh. What can I say, I think both things are true. Most of the time when I’ve heard someone claim that EA is a religion they are in fact trying to dunk on EA, and I wrote a post where I previously took a more expansive view of what should count as religion in a way that included EA. I don’t really endorse the expansive view now, and I also was not trying to claim EA is bad because it has features of religion.
I am still in fact curious what you meant since you didn’t and haven’t explained. That I may have some reason to think of EA as a religion doesn’t bear on the question of why you might think it is, given that we seem to disagree a lot.
I am still in fact curious what you meant since you didn’t and haven’t explained.

Gee, I wonder why? Could something have somehow prevented me from replying to your request for elaboration, in the linked comment thread? Can’t think what that might be, though…
Anyhow, the answer given in the grandparent is my actual answer. You were right the first time! Defining “religion” as “belief systems that involve supernatural claims”, or something along those lines, is commonplace but basically useless, because it results in a bunch of false positives (belief in ghosts, for instance) and, more importantly, a bunch of false negatives; we end up getting distracted by surface features, while missing the really important properties of religion, and failing to recognize other things which have those properties.
You’re not the first, but no less correct thereby, to make the observation that the core characteristics of religion, which make it have the practical properties that it has, mostly aren’t the supernatural beliefs, but rather stuff like “a belief system that makes sense of the world and one’s place in it, provides meaning, organizes one’s life via rituals, creates an ingroup, tells you how to live, tells you that you’ll be a good person if you do this-and-such”. (The part you were wrong about is the claim that these are good things.)
So Christianity is a religion, but so is Communism. Buddhism is a religion, but so is Progressivism. Judaism is a religion, but so is Effective Altruism (but I repeat myself).
And you’re also right that calling EA a religion, in this sense, is not exactly flattering. But it’s definitely not a content-free slur. It’s pointing out that a good chunk of what makes EA so attractive (especially to the sorts of people who are most attracted to it) is that they have a “religion-shaped hole” in their lives, but for various reasons, traditional religions do not suit them, while EA does—and ends up telling them how to live, providing their lives with meaning, giving them an ingroup, telling them that they’re good people if they do certain things, etc., etc. This indeed does not constitute “engagement with” EA on its own terms (as a set of positive and normative claims), yet that doesn’t actually make it false as an anthropological observation about EA as a memeplex / social movement / etc.
Thanks for writing this out. The details of your position are far more charitable than the tone of your comments suggested (and than even the tone of this comment suggests!), and I basically agree that EA has a lot of religion-like features, which is why I (incorrectly!) thought of it as being like a religion in the past.
I discussed my reasons for now thinking it’s not a religion over in the linked thread, but to reiterate: it’s because EA lacks a rich relationship with sacredness (especially shared sacredness, even if some pockets of EA manage to hold some small number of things sacred), and is not high-commitment in the ways that religions are (though it is high-commitment in other ways). As I’ve spent more time understanding why it is I think religion is good, I’ve been able to get more clarity on just what features make it good, and also on what features set religion-like groups apart from the ones we identify as central examples of religions.