The hypotheses listed mostly focus on the internal aspects of CFAR.
This may be somewhat misleading to a naive reader. (I am speaking mainly to this hypothetical naive reader, not to Anna, who is non-naive.)
What CFAR was trying to do was extremely ambitious, and it was very likely going to ‘fail’ in some way. It’s good FOR CFAR to consider what the org could improve on (which is where its leverage is), but for a big picture view of it, you should also think about the overall landscape and circumstances surrounding CFAR. And some of this was probably not obvious at the outset (at the beginning of its existence), and so CFAR may have had to discover where certain major roadblocks were, as they tried to drive forward. This post doesn’t seem to touch on those roadblocks in particular, maybe because they’re not as interesting as considering the potential leverage points.
But if you’re going to be realistic about this and want the big-picture sense, you should consider the following:
OK, so CFAR’s mission under Pete’s leadership was to find/train people who could be effective responders to x-risk, particularly AI risk.
There is the possibility that most of the relevant ‘action’ on CFAR’s part is on ‘finding’ the right people, with the right starting ingredients, whatever those may be. But maybe there just weren’t that many good starting ingredients to be found. That limiting factor, if indeed it was a limiting factor, would have hampered CFAR’s ability to succeed in its mission.
Hard problems around this whole thing also include: How do you know what the right starting ingredients even are? What do these ‘right people’ even look like? Are they going to be very similar to each other or very different? How much is the training supposed to be customized for the individual? What parts of the curriculum should be standardized?
Additional possibility: Maybe the CFAR training wouldn’t bear edible fruit for another ten years after that person’s initial exposure to CFAR? (And like, I’m leaning on this being somewhat true?) If this is the case, you’re just stuck with slow feedback loops. (Additionally, consider the possibility that people who seem to be progressing ‘quickly’ might be doing this in a misleading way or your criteria for judging are quite wrong, causing you to make changes to your training that lead you astray.)
Less hard problem but adds complexity: How do you deal with the fact that people in this culture, especially rationalists, get all sensitive around being evaluated? You need to evaluate people, in the end, because you don’t have the ability to train everyone who wants it, and not everyone is ready or worth the investment. But then people tend to get all fidgety and triggered when you start putting them in different buckets, especially when you’re in a culture that believes strongly in individualism (“I am special, I have something to offer”) and equality (“Things should be fair, everyone should have the same opportunities.”). And also you’re working with people who were socialized from a young age to identify with their own intelligence as a major part of their self-worth, and then they come into your community, feeling like they’ve finally found their people, only to be told: “Sorry, you’re not actually cut out for this work. It’s not about you.”
Also:
The egregores that are dominating mainstream culture and the global world situation are not just sitting passively around while people try to train themselves to break free of their deeply ingrained patterns of mind. I think people don’t appreciate just how hard it is to uninstall the malware most of us are born with / educated into (and which blocks people from original thinking). These egregores have been functioning for hundreds of years. Is the ground fertile for the art of rationality? My sense is that the ground is dry and salted, and yet we still make attempts to grow the art out of that soil.
IMO the effects that have led us to our current human-created global crises are the same ones that make it difficult to train people in rationality. So, y’all are up against a strong and powerful foe.
Honestly my sense is that CFAR was significantly crippled by one or more of these egregores (partially due to its own cowardice). But that’s a longer conversation, and I’m not going to have it out here.
//
All of this is just to give a taste of how difficult the original problems were that CFAR was trying to resolve. We’re not in a world that’s like, “Oh yeah, with your hearts and minds in the right place, you’ll make it through!” Or even “If you just have the best thoughts compared to all the other people, you’ll win!” Or even “If you have the best thoughts, a slick and effective team, lots of money, and a lot of personal agency and ability, you’ll definitely find the answers you seek.”
And so the list of hypotheses + analyses above may make it sound like if CFAR had its shit more ‘together’, it would have done a better job. Maybe? How much better though? Realistically?
As we move forward on this wild journey, it just seems to become clearer how hard this whole situation really is. The more collective clarity we have on the “actual ground-level situation” (versus internal ideas, hopes, wishes, and fears coloring our perspective of reality), … honestly the more confronting it all is. The more existentially horrifying. And just touching THAT is hard (impossible?) for most people.
(Which is partially why I’m training at a place like MAPLE. I seem to be saner now about x-risk. And I get that we’re rapidly running out of time without feeling anxious about that fact and without needing to reframe it in a more hopeful way. I don’t have much need for hope, it seems. And it doesn’t stop me from wanting to help.)
The egregores that are dominating mainstream culture and the global world situation are not just sitting passively around while people try to train themselves to break free of their deeply ingrained patterns of mind. I think people don’t appreciate just how hard it is to uninstall the malware most of us are born with / educated into (and which blocks people from original thinking). These egregores have been functioning for hundreds of years. Is the ground fertile for the art of rationality? My sense is that the ground is dry and salted, and yet we still make attempts to grow the art out of that soil.
IMO the effects that have led us to our current human-created global crises are the same ones that make it difficult to train people in rationality. So, y’all are up against a strong and powerful foe.
Honestly my sense is that CFAR was significantly crippled by one or more of these egregores (partially due to its own cowardice).
Yes; I agree with this. And it seems big. I wish I knew more legible, obviously-real concepts for trying to get at this.
I probably don’t have the kinds of concepts you’re interested in, but…
Some significant conceptual pieces in my opinion are:
“As above, so below.” Everything that happens in the world can be seen as a direct, fractal-like reflection of ‘the mind’ that is operating (both individual and collective). Basically, things like ‘colonialism’ and ‘fascism’ and all that are external representations of the internal. (So, when some organization is having ‘a crisis’ of some kind, this is like the Shakespeare play happening on stage… playing out something that’s going on internal to the org, both at the group level and the individual level.) Egregores, therefore, are also linked inextricably to ‘the mind’, broadly construed. They’re ‘emergent’ and not ‘fixed’. (So whatever this ‘rationality’ thing is, it could be important in a fundamental way, if it changes ‘the mind’.) Circling makes this tangible on a small scale.
My teacher gave a talk on “AI” where he lists four kinds of processes (or algorithms, you could say) that all fit onto a spectrum. Artificial Intelligence > Culture > Emotions / Thoughts > Sense perception. Each of these ‘algorithms’ has ‘agendas’ or ‘functions’. And these functions are not necessarily in service of truth. (‘Sense perception’ clearly evolved from natural selection, which is keyed into survival and reproduction. Not truth-seeking aims. In other words, it’s ‘not aligned’.) Humans ‘buy in’ to these algorithms and deeply believe they’re serving our betterment, but ‘fitness’ (ability to survive and reproduce) does not necessarily follow from being ‘more truth-aligned’ or ‘goodness-aligned’. So … a deeper investigation may be needed to discern what’s trustworthy. Why do we believe what we believe? Why do we believe the results of AI processes… and then why do we believe in our cultural ideologies? And why do I buy into my thoughts and feelings? Being able to see the nature of all four of these processes, and seeing how they’re the same phenomena on different scales / using different mediums, is useful.
Different people have different ‘roles’ with respect to the egregores. The obvious role I see is something like ‘fundamentalist priest’? Rationality has ‘fundamentalist priests’ too. They use their religion as a tool for controlling others. “Wow you don’t believe X? You must be stupid or insane.” To be more charitable though, some people just ‘want to move on’ from debating things that they’ve already ‘resolved’ as ‘true’. And so they reify certain doctrines as ‘true doctrine’ and then create platforms, organizations, and institutions where those doctrines are ‘established truth’. From THERE, it becomes much easier to coordinate. And coordination is power. By aligning groups using doctrines, these groups ‘get a lot done’. “Getting a lot done” here includes taking stuff over… ideological conquest, among other forms of conquest. This is the pattern that has played out for thousands of years. We have not broken free of this at all, and rationality (maybe moreso EA) has played right into this. And now there’s a lot of incentive to maintain and prop up these ‘doctrines’ because a lot has been built on top of them.
Why do humans keep getting captured? Well, we’re really easy to manipulate. I think the Sequences cover a lot of this… but also, things like ‘fear of death, illness, and loss of livelihood’ are pretty reliable things humans fall prey to. They readily give away their power when faced with these fears. See: COVID-19.
Because we are afraid of various forms of loss, we desperately build and maintain castles on top of propped up, false doctrines… so yeah, we’re scheduling our own collapse. That shit is not gonna hold. Everything we see happening in this world, we ourselves created the conditions for.
It’s originally an occult term, but my more-materialistic definition of it is “something that acts like an entity with motivations that is considerably bigger than a human and is generally run in a ‘distributed computing’ fashion across many individual minds.” Microsoft the company is an egregore; feminism the social movement is an egregore; America the country is an egregore. The program “Minecraft” is not an egregore, an individual deer is not an egregore, a river is not an egregore.
Unreal’s point is that these things ‘fight back’ and act on their distributed perception; if your corner of the world comes to believe that academia is a wasteful trap, for example, “academia” will notice and label you various things, which will then cause pro-academia people to avoid you and anti-academia people to start treating you as a political ally, both of which can make you worse off / twisted away from your original purpose.
It’s possible to create an organization that, in a technical sense, isn’t an egregore, though. Lots of people have tried to create secular churches, for instance, but they mostly just fall flat because they’re not viable designs for creating a living distributed entity.
Some parties (as in, a group of people at some gathering) fail to congeal into an egregore. But when they do, the scene “clicks”. And sometimes those spawn egregores that outlast the party — but not often.
Thanks for weighing in; I trust these conversations a lot more when they have multiple people from current or former CFAR. (For anyone not tracking, Unreal worked at CFAR for a while.) (And, sorry, I know you said you’re mainly writing this to not-me, but I want to engage anyhow.)
The hypotheses listed mostly focus on the internal aspects of CFAR.
This may be somewhat misleading to a naive reader. (I am speaking mainly to this hypothetical naive reader, not to Anna, who is non-naive.)
… It’s good FOR CFAR to consider what the org could improve on (which is where its leverage is), but for a big picture view of it, you should also think about the overall landscape and circumstances surrounding CFAR. And some of this was probably not obvious at the outset (at the beginning of its existence), and so CFAR may have had to discover where certain major roadblocks were, as they tried to drive forward. This post doesn’t seem to touch on those roadblocks in particular, maybe because they’re not as interesting as considering the potential leverage points.
Re: the above: I was actually trying to focus on not-specific-to-us-as-individuals factors that made the problem hard, or that made particular failure modes easy to fall into. I am hoping this post and its comments might be of use to both future-CFAR (e.g., future-me), and anyone aiming to build an “art of rationality” via some other group/place/effort.
So, if you skim over my hypotheses in the side-panel, they are things like “it’s difficult to distinguish effective and ineffective interventions” and “in practice, many/most domains incentivize social manipulation rather than rationality.” (Not things like “such-and-such an individual had such-and-such an unusual individual weakness.”)
That is, I’m trying to understand and describe the background conditions that, IMO, gradually pulled CFAR and its members toward kinds of activity that had less of a shot at creating a real art of rationality. (My examples do involve us-in-particular, but that’s because that’s where the data is; that’s what we know that others may want to know, when trying to build out an accurate picture of what paths have a shot at getting to a real art of rationality.)
I think we’re maybe tackling the same puzzle, then (the puzzle of “how can a group take a good shot at building an art of rationality / what major obstacles are in the way / what is a person likely to miss in their first attempt, that might be nice to instead know about?”). And we’re simply arriving at different guesses about the answers to that puzzle?
I think a careful and non-naive reading of your post would avoid the issues I was trying to address.
But I think a naive reading of your post might come across as something like, “Oh CFAR was just not that good at stuff I guess” / “These issues seem easy to resolve.”
So I felt it was important to acknowledge the magnitude of the ambition of CFAR and that such projects are actually quite difficult to pull off, especially in the post-modern information age.
//
I wish I could say I was speaking from an interest in tackling the puzzle. I’m not coming from there.
So if we look at the egregore as having a flavor of agency and intention… this egregore demands constant extraction of resources from the earth. It demands people want things it doesn’t need (consumer culture). It disempowers or destroys anything that manages to avoid it or escape it (e.g. self-sufficient villages, cultures that don’t participate) - there’s an extinction of hunter-gatherer lifestyles going on; there’s legally mandated taking of children from villages in order to indoctrinate them into civilization (in Malaysia anyway; China is doing a ‘nicer’ version). There’s energy-company goons that go into rainforests and chase out tribes from their homes in order to take their land. This egregore does not care about life or the planet.
You are welcome to disagree of course, this is just one perspective.
I dunno what to call this one, but it’s got Marxist roots
There’s an egregore that feeds off class division. So right now, there’s a bunch of these going on at once. The following are ‘crudely defined’ and I don’t mean them super literally, but just trying to point at some of the dividing lines, as examples: Feminists vs privileged white men. Poor blacks vs white cops. The 99% vs the 1%. Rural vs urban. This egregore wants everyone to feel persecuted. All these different class divisions feed into the same egregore.
Do the rationalists feel persecuted / victimized? Oh yeah. Like, not literally all of them, but I’d say a significant chunk of them. Maybe most of them. So they haven’t successfully seen through this one.
power-granting religion, broadly construed
Christianity is historically the main example of a religious egregore. But a newer contender is ‘scientism’. Scientism is not the true art of science and doesn’t resemble it at all. Scientism has ordained priests that have special access to journals (knowledge) and special privileges that give them the ability to publish in those esoteric texts. Governments, corporations, and the egregores mentioned above want control over these priests. Sometimes they buy priests of their own.
Obviously this egregore doesn’t benefit from ordinary people having critical thinking skills and the ability to evaluate the truth for themselves. It dissuades people from trying by creating high barriers to entry and making its texts hard or time-consuming to comprehend. It gets away with a lot of shit by having a strong brand. The integrity behind that brand has significantly degraded, over the decades.
These three egregores benefit from people feeling powerless, worthless, or apathetic (malware). Basically the opposite of heroic, worthy, and compassionate (liberated, loving sovereignty). Helping to start uninstalling the malware is, like, one of the things CFAR has to do in order to even start having conversations about AI with most people.
And, unfortunately… like… often, buying into one of these egregores (usually this would be unconsciously done) actually makes a person more effective. Sometimes quite ‘successful’ according to the egregore’s standards (rich, powerful, well-respected, etc). The egregores know how to churn out ‘effective’ people. But these people are ‘effective’ in service to the egregore. They’re not necessarily effective outside of that context.
So, any sincere and earnest movement has to contend with this eternal temptation:
Do we sell out? By how much?
The egregore tempts you with its multitude of resources. To some extent, I think you have to engage. Since you’re trying to ultimately change the direction of history, right?
This egregore wants everyone to feel persecuted. … Do the rationalists feel persecuted / victimized? Oh yeah. … So they haven’t successfully seen through this one.
Note that this doesn’t follow. It might be, for example, that the egregore causes (some) people to feel persecuted by causing them to be persecuted.
(Admittedly I’m not sure I know what it means to “see through” an egregore. Like, if you “see through” capitalism, you… recognize capitalism as an egregore that demands etc.? You recognize that there may be other ways to organize a society, though you may or may not think any of those other ways are preferable all things considered? You want fewer things that you don’t need?
But presumably, “seeing through” it doesn’t extract you from your capitalist society, if you live in one; you still need a job to get money, and you still need money to purchase goods and services, and so on. And if you don’t live in a capitalist society but a capitalist society is coming to take your land and separate you from your children, “seeing through” capitalism doesn’t protect you from that either.
And so presumably, “seeing through” an egregore that wants you to feel persecuted, doesn’t make you not-persecuted. It might make you not-feel-persecuted if you’re in-fact not persecuted.)
I dunno if I was clear enough here about what it means to feel persecuted.
So the way I’m using that phrase, ‘feeling persecuted’ is not desirable whether you are actually being persecuted or not.
‘Feeling persecuted’ means feeling helpless, powerless, or otherwise victimized. Feeling like the universe is against you or your tribe, and that things are (in some sense) inherently bad and may forever be bad, and that nothing can be done.
If, indeed, you are part of a group that has fewer rights and privileges than the dominant groups, you can acknowledge to yourself “my people don’t have the same rights as other people” but you don’t have to feel any sense of persecution around that. You can just see that it is true and happening, without feeling helpless and like something is inherently broken or that you are inherently broken.
Seeing through the egregore would help a person realize that ‘oh there is an egregore feeding on my beliefs about being persecuted but it’s not actually a fundamental truth about the world; things can actually be different; and I’m not defined by my victimhood. maybe i should stop feeding this egregore with these thoughts and feelings that don’t actually help anything or anyone and isn’t really an accurate representation of reality anyway.’
So I don’t really want to get into this, my note was about the structure of the argument rather than factual claims about the world. But...
I think I feel motte-and-baileyed? When I read your original comment with the term “feel persecuted” I’m like “eh, dunno, sounds plausible I guess?”. When I read it trying to substitute in the definition you give, I’m like “…mm, skeptical”.
Like I get that jargon sometimes just has that effect, I’m not currently saying you shouldn’t use that term with that meaning. But that’s my reaction.
(If you do want a different hook to use, it sounds like “feel persecuted, and also be clinically depressed” is tongue-in-cheek kinda close to what you describe? Though bringing in the concept of “depression”, and especially “clinical” depression, may not help see things clearly either.)
No, it’s definitely not about being depressed. That’s very far from it. But I also don’t want to argue about the claims here. Seems maybe beside the point.
I think I could reword my original argument in a way that wouldn’t be a problem. I just wasn’t careful in my languaging, but I personally think it’s fine? I think you might be reading a lot into my usage of the word “So”.
Scientism has ordained priests that have special access to journals (knowledge) and special privileges that give them the ability to publish in those esoteric texts.
The Ivermectin case suggested that journals are not actually important to Scientism. Nobody cared about peer-reviewed meta-analyses when those went counter to institutional positions.
How do you deal with the fact that people in this culture, especially rationalists, get all sensitive around being evaluated? You need to evaluate people, in the end, because you don’t have the ability to train everyone who wants it, and not everyone is ready or worth the investment. But then people tend to get all fidgety and triggered when you start putting them in different buckets
Uhm, some kind of Comfort Zone Expansion? Make people participate in competitions where they will predictably lose; and then they realize that life goes on.
Also, some kind of: “you should sincerely hope that you are not the smartest person on this planet, because if the smartest person on this planet is you, then frankly we are all going to die (and yes, this includes you)… but if there are many people smarter than you, then perhaps we might survive the AI” perspective.
If everyone is special, no one is. But in reality, some people are special-special, and most people are just ordinary-special. Statistically, you most likely belong to the latter group.
But the good news is that you have something to offer even if you are not special! Some things need to be done repeatedly, or at multiple places (such as organizing a local LessWrong meetup). Some things are important but not the most important, which is why the special-special people do not have enough time to do them, so it’s up to you.
Be honest and admit that it’s not about what you can offer, but what status you hope to get in return.
“Things should be fair, everyone should have the same opportunities.”
you’re working with people who were socialized from a young age to identify with their own intelligence as a major part of their self-worth, and then they come into your community, feeling like they’ve finally found their people, only to be told: “Sorry you’re not actually cut out for this work. It’s not about you.”
What hypocrisy! If only highly intelligent people are worthy, then most humans never got a real opportunity. But those don’t matter, I suppose; only the members of the intellectual elite should have the same opportunities. You are a member of the elite, but not a member of the elite-within-elite. Congratulations, now you can better empathize with the intellectual 98%.
People who are not fit for the elite work can still be welcome in the rationalist community.
The egregores that are dominating mainstream culture and the global world situation are not just sitting passively around while people try to train themselves to break free of their deeply ingrained patterns of mind.
Wait, don’t generalize so quickly. Perhaps being in the Bay Area is playing on hard mode, from this perspective. Move to a place where people are… more emotionally capable of being told they are not the planet’s #1.
The hypotheses listed mostly focus on the internal aspects of CFAR.
This may be somewhat misleading to a naive reader. (I am speaking mainly to this hypothetical naive reader, not to Anna, who is non-naive.)
What CFAR was trying to do was extremely ambitious, and it was very likely going to ‘fail’ in some way. It’s good FOR CFAR to consider what the org could improve on (which is where its leverage is), but for a big picture view of it, you should also think about the overall landscape and circumstances surrounding CFAR. And some of this was probably not obvious at the outset (at the beginning of its existence), and so CFAR may have had to discover where certain major roadblocks were, as they tried to drive forward. This post doesn’t seem to touch on those roadblocks in particular, maybe because they’re not as interesting as considering the potential leverage points.
But if you’re going to be realistic about this and want the big-picture sense, you should consider the following:
OK, so CFAR’s mission under Pete’s leadership was to find/train people who could be effective responders to x-risk, particularly AI risk.
There is the possibility that most of the relevant ‘action’ on CFAR’s part is on ‘finding’ the right people, with the right starting ingredients, whatever those may be. But maybe there just weren’t that many good starting ingredients to be found. That limiting factor, if indeed it was a limiting factor, would have hampered CFAR’s ability to succeed in its mission.
Hard problems around this whole thing also include: How do you know what the right starting ingredients even are? What do these ‘right people’ even look like? Are they going to be very similar to each other or very different? How much is the training supposed to be customized for the individual? What parts of the curriculum should be standardized?
Additional possibility: Maybe the CFAR training wouldn’t bear edible fruit for another ten years after that person’s initial exposure to CFAR? (And like, I’m leaning on this being somewhat true?) If this is the case, you’re just stuck with slow feedback loops. (Additionally, consider the possibility that people who seem to be progressing ‘quickly’ might be doing this in a misleading way or your criteria for judging are quite wrong, causing you to make changes to your training that lead you astray.)
Less hard problem but adds complexity: How do you deal with the fact that people in this culture, esp rationalists?, get all sensitive around being evaluated? You need to evaluate people, in the end, because you don’t have the ability to train everyone who wants it, and not everyone is ready or worth the investment. But then people tend to get all fidgety and triggered when you start putting them in different buckets, especially when you’re in a culture that believes strongly in individualism (“I am special, I have something to offer”) and equality (“Things should be fair, everyone should have the same opportunities.”). And also you’re working with people who were socialized from a young age to identify with their own intelligence as a major part of their self-worth, and then they come into your community, feeling like they’ve finally found their people, only to be told: “Sorry you’re not actually cut out for this work. It’s not about you.”
Also:
The egregores that are dominating mainstream culture and the global world situation are not just sitting passively around while people try to train themselves to break free of their deeply ingrained patterns of mind. I think people don’t appreciate just how hard it is to uninstall the malware most of us are born with / educated into (and which block people from original thinking). These egregores have been functioning for hundreds of years. Is the ground fertile for the art of rationality? My sense is that the ground is dry and salted, and yet we still make attempts to grow the art out of that soil.
IMO the same effects that have led us to current human-created global crises are the same ones that make it difficult to train people in rationality. So, ya’ll are up against a strong and powerful foe.
Honestly my sense is that CFAR was significantly crippled by one or more of these egregores (partially due to its own cowardice). But that’s a longer conversation, and I’m not going to have it out here.
//
All of this is just to give a taste of how difficult the original problems were that CFAR was trying to resolve. We’re not in a world that’s like, “Oh yeah, with your hearts and minds in the right place, you’ll make it through!” Or even “If you just have the best thoughts compared to all the other people, you’ll win!” Or even “If you have the best thoughts, a slick and effective team, lots of money, and a lot of personal agency and ability, you’ll definitely find the answers you seek.”
And so the list of hypotheses + analyses above may make it sound like if CFAR had its shit more ‘together’, it would have done a better job. Maybe? How much better though? Realistically?
As we move forward on this wild journey, it just seems to become clearer how hard this whole situation really is. The more collective clarity we have on the “actual ground-level situation” (versus internal ideas, hopes, wishes, and fears coloring our perspective of reality), … honestly the more confronting it all is. The more existentially horrifying. And just touching THAT is hard (impossible?) for most people.
(Which is partially why I’m training at a place like MAPLE. I seem to be saner now about x-risk. And I get that we’re rapidly running out of time without feeling anxious about that fact and without needing to reframe it in a more hopeful way. I don’t have much need for hope, it seems. And it doesn’t stop me from wanting to help.)
Yes; I agree with this. And it seems big. I wish I knew more legible, obviously-real concepts for trying to get at this.
I probably don’t have the kinds of concepts you’re interested in, but…
Some significant conceptual pieces in my opinion are:
“As above, so below.” Everything that happens in the world can be seen as a direct, fractal-like reflection of ‘the mind’ that is operating (both individual and collective). Basically, things like ‘colonialism’ and ‘fascism’ and all that are external representations of the internal. (So, when some organization is having ‘a crisis’ of some kind, this is like the Shakespeare play happening on stage… playing out something that’s going on internal to the org, both at the group level and the individual level.) Egregores, therefore, are also linked inextricably to ‘the mind’, broadly construed. They’re ‘emergent’ and not ‘fixed’. (So whatever this ‘rationality’ thing is, could be important in a fundamental way, if it changes ‘the mind’.) Circling makes this tangible on a small scale.
My teacher gave a talk on “AI” where he lists four kinds of processes (or algorithms, you could say) that all fit onto a spectrum. Artificial Intelligence > Culture > Emotions / Thoughts > Sense perception. Each of these ‘algorithms’ has ‘agendas’ or ‘functions’. And these functions are not necessarily in service of truth. (‘Sense perception’ clearly evolved from natural selection, which is keyed into survival and reproduction. Not truth-seeking aims. In other words, it’s ‘not aligned’.) Humans ‘buy in’ to these algorithms and deeply believe they’re serving our betterment, but ‘fitness’ (ability to survive and reproduce) is not necessarily the result of being ‘more truth-aligned’ or ‘goodness-aligned’. So … a deeper investigation may be needed to discern what’s trustworthy. Why do we believe what we believe? Why do we believe the results of AI processes… and then why do we believe in our cultural ideologies? And why do I buy into my thoughts and feelings? Being able to see the nature of all four of these processes, and to see how they’re the same phenomenon on different scales / using different mediums, is useful.
Different people have different ‘roles’ with respect to the egregores. The obvious role I see is something like ‘fundamentalist priest’? Rationality has ‘fundamentalist priests’ too. They use their religion as a tool for controlling others. “Wow, you don’t believe X? You must be stupid or insane.” To be more charitable though, some people just ‘want to move on’ from debating things that they’ve already ‘resolved’ as ‘true’. And so they reify certain doctrines as ‘true doctrine’ and then create platforms, organizations, and institutions where those doctrines are ‘established truth’. From THERE, it becomes much easier to coordinate. And coordination is power. By aligning groups using doctrines, these groups ‘get a lot done’. “Getting a lot done” here includes taking stuff over… ideological conquest, among other forms of conquest. This is the pattern that has played out for thousands of years. We have not broken free of this at all, and rationality (maybe more so EA) has played right into it. And now there’s a lot of incentive to maintain and prop up these ‘doctrines’, because a lot has been built on top of them.
Why do humans keep getting captured? Well, we’re really easy to manipulate. I think the Sequences cover a lot of this… but also, things like ‘fear of death, illness, and loss of livelihood’ are pretty reliable things humans fall prey to. They readily give away their power when faced with these fears. See: COVID-19.
Because we are afraid of various forms of loss, we desperately build and maintain castles on top of propped up, false doctrines… so yeah, we’re scheduling our own collapse. That shit is not gonna hold. Everything we see happening in this world, we ourselves created the conditions for.
What exactly is an egregore?
It’s originally an occult term, but my more-materialistic definition of it is “something that acts like an entity with motivations that is considerably bigger than a human and is generally run in a ‘distributed computing’ fashion across many individual minds.” Microsoft the company is an egregore; feminism the social movement is an egregore; America the country is an egregore. The program “Minecraft” is not an egregore, an individual deer is not an egregore, a river is not an egregore.
Unreal’s point is that these things ‘fight back’ and act on their distributed perception; if your corner of the world comes to believe that academia is a wasteful trap, for example, “academia” will notice and label you various things, which will then cause pro-academia people to avoid you and anti-academia people to start treating you as a political ally, both of which can make you worse off / twisted away from your original purpose.
Is it fair to say that organizations, movements, polities, and communities are all egregores?
Pretty much, yes.
It’s possible to create an organization in a technical sense that isn’t an egregore though. Lots of people have tried to create secular churches, for instance, but they mostly just fall flat because they’re not a viable design to create a living distributed entity.
Some parties (as in, a group of people at some gathering) fail to congeal into an egregore. But when they do, the scene “clicks”. And sometimes those spawn egregores that outlast the party — but not often.
So, it’s a little complicated.
But to a first approximation, yes.
Thanks for weighing in; I trust these conversations a lot more when they have multiple people from current or former CFAR. (For anyone not tracking, Unreal worked at CFAR for a while.) (And, sorry, I know you said you’re mainly writing this to not-me, but I want to engage anyhow.)
Re: the above: I was actually trying to focus on not-specific-to-us-as-individuals factors that made the problem hard, or that made particular failure modes easy to fall into. I am hoping this post and its comments might be of use to both future-CFAR (e.g., future-me), and anyone aiming to build an “art of rationality” via some other group/place/effort.
So, if you skim over my hypotheses in the side-panel, they are things like “it’s difficult to distinguish effective and ineffective interventions” and “in practice, many/most domains incentivize social manipulation rather than rationality.” (Not things like “such-and-such an individual had such-and-such an unusual individual weakness.”)
That is, I’m trying to understand and describe the background conditions that, IMO, gradually pulled CFAR and its members toward kinds of activity that had less of a shot at creating a real art of rationality. (My examples do involve us-in-particular, but that’s because that’s where the data is; that’s what we know that others may want to know, when trying to build out an accurate picture of what paths have a shot at getting to a real art of rationality.)
I think we’re maybe tackling the same puzzle, then (the puzzle of “how can a group take a good shot at building an art of rationality / what major obstacles are in the way / what is a person likely to miss in their first attempt, that might be nice to instead know about?”). And we’re simply arriving at different guesses about the answers to that puzzle?
Right.
I think a careful and non-naive reading of your post would avoid the issues I was trying to address.
But I think a naive reading of your post might come across as something like, “Oh CFAR was just not that good at stuff I guess” / “These issues seem easy to resolve.”
So I felt it was important to acknowledge the magnitude of the ambition of CFAR and that such projects are actually quite difficult to pull off, especially in the post-modern information age.
//
I wish I could say I was speaking from an interest in tackling the puzzle. I’m not coming from there.
Could you clarify what egregores you meant when you said:
The main ones are:
modern capitalism / the global economy
So if we look at the egregore as having a flavor of agency and intention… this egregore demands constant extraction of resources from the earth. It demands that people want things they don’t need (consumer culture). It disempowers or destroys anything that manages to avoid it or escape it (e.g. self-sufficient villages, cultures that don’t participate): there’s an ongoing extinction of hunter-gatherer lifestyles; there’s legally mandated taking of children from villages in order to indoctrinate them into civilization (in Malaysia anyway; China is doing a ‘nicer’ version). There are energy-company goons that go into rainforests and chase tribes out of their homes in order to take their land. This egregore does not care about life or the planet.
You are welcome to disagree of course, this is just one perspective.
I dunno what to call this one, but it’s got Marxist roots
There’s an egregore that feeds off class division. So right now, there’s a bunch of these going on at once. The following are ‘crudely defined’ and I don’t mean them super literally, but just trying to point at some of the dividing lines, as examples: Feminists vs privileged white men. Poor blacks vs white cops. The 99% vs the 1%. Rural vs urban. This egregore wants everyone to feel persecuted. All these different class divisions feed into the same egregore.
Do the rationalists feel persecuted / victimized? Oh yeah. Like, not literally all of them, but I’d say a significant chunk of them. Maybe most of them. So they haven’t successfully seen through this one.
power-granting religion, broadly construed
Christianity is historically the main example of a religious egregore. But a newer contender is ‘scientism’. Scientism is not the true art of science and doesn’t resemble it at all. Scientism has ordained priests that have special access to journals (knowledge) and special privileges that give them the ability to publish in those esoteric texts. Governments, corporations, and the egregores mentioned above want control over these priests. Sometimes they buy their own.
Obviously this egregore doesn’t benefit from ordinary people having critical thinking skills and the ability to evaluate the truth for themselves. It dissuades people from trying by creating high barriers to entry and making its texts hard or time-consuming to comprehend. It gets away with a lot of shit by having a strong brand. The integrity behind that brand has significantly degraded over the decades.
These three egregores benefit from people feeling powerless, worthless, or apathetic (malware). Basically the opposite of heroic, worthy, and compassionate (liberated, loving sovereignty). Helping to start uninstalling the malware is, like, one of the things CFAR has to do in order to even start having conversations about AI with most people.
And, unfortunately… like… often, buying into one of these egregores (usually this would be unconsciously done) actually makes a person more effective. Sometimes quite ‘successful’ according to the egregore’s standards (rich, powerful, well-respected, etc). The egregores know how to churn out ‘effective’ people. But these people are ‘effective’ in service to the egregore. They’re not necessarily effective outside of that context.
So, any sincere and earnest movement has to contend with this eternal temptation:
Do we sell out? By how much?
The egregore tempts you with its multitude of resources. To some extent, I think you have to engage. Since you’re trying to ultimately change the direction of history, right?
Still, ahhh, tough. Tough call. Tricky.
Note that this doesn’t follow. It might be, for example, that the egregore causes (some) people to feel persecuted by causing them to be persecuted.
(Admittedly I’m not sure I know what it means to “see through” an egregore. Like, if you “see through” capitalism, you… recognize capitalism as an egregore that demands etc.? You recognize that there may be other ways to organize a society, though you may or may not think any of those other ways are preferable all things considered? You want fewer things that you don’t need?
But presumably, “seeing through” it doesn’t extract you from your capitalist society, if you live in one; you still need a job to get money, and you still need money to purchase goods and services, and so on. And if you don’t live in a capitalist society but a capitalist society is coming to take your land and separate you from your children, “seeing through” capitalism doesn’t protect you from that either.
And so presumably, “seeing through” an egregore that wants you to feel persecuted, doesn’t make you not-persecuted. It might make you not-feel-persecuted if you’re in-fact not persecuted.)
I dunno if I was clear enough here about what it means to feel persecuted.
So the way I’m using that phrase, ‘feeling persecuted’ is not desirable whether you are actually being persecuted or not.
‘Feeling persecuted’ means feeling helpless, powerless, or otherwise victimized. Feeling like the universe is against you or your tribe, and that things are (in some sense) inherently bad and may forever be bad, and that nothing can be done.
If, indeed, you are part of a group that has fewer rights and privileges than the dominant groups, you can acknowledge to yourself “my people don’t have the same rights as other people” but you don’t have to feel any sense of persecution around that. You can just see that it is true and happening, without feeling helpless and like something is inherently broken or that you are inherently broken.
Seeing through the egregore would help a person realize that ‘oh, there is an egregore feeding on my beliefs about being persecuted, but it’s not actually a fundamental truth about the world; things can actually be different; and I’m not defined by my victimhood. maybe i should stop feeding this egregore with these thoughts and feelings that don’t actually help anything or anyone and aren’t really an accurate representation of reality anyway.’
So I don’t really want to get into this, my note was about the structure of the argument rather than factual claims about the world. But...
I think I feel motte-and-baileyed? When I read your original comment with the term “feel persecuted” I’m like “eh, dunno, sounds plausible I guess?”. When I read it trying to substitute in the definition you give I’m like “...mm, skeptical”.
Like I get that jargon sometimes just has that effect, I’m not currently saying you shouldn’t use that term with that meaning. But that’s my reaction.
(If you do want a different hook to use, it sounds like “feel persecuted, and also be clinically depressed” is tongue-in-cheek kinda close to what you describe? Though bringing in the concept of “depression”, and especially “clinical” depression, may not help see things clearly either.)
No, it’s definitely not about being depressed. That’s very far from it. But I also don’t want to argue about the claims here. Seems maybe beside the point.
I think I could reword my original argument in a way that wouldn’t be a problem. I just wasn’t careful in my languaging, but I personally think it’s fine? I think you might be reading a lot into my usage of the word “So”.
In the Ivermectin case, it seemed that journals are not that important to Scientism. Nobody cared about peer-reviewed meta-analyses when those went counter to institutional positions.
Uhm, some kind of Comfort Zone Expansion? Make people participate in competitions where they will predictably lose; and then they realize that life goes on.
Also, some kind of: “you should sincerely hope that you are not the smartest person on this planet, because if the smartest person on this planet is you, then frankly we are all going to die (and yes, this includes you)… but if there are many people smarter than you, then perhaps we might survive the AI” perspective.
Another reframe: “ego is for losers who are incapable of facing the reality. what is true is already true...”.
If everyone is special, no one is. But in reality, some people are special-special, and most people are just ordinary-special. Statistically, you most likely belong to the latter group.
But the good news is that you have something to offer even if you are not special! Some things need to be done repeatedly, or at multiple places (such as organizing a local LessWrong meetup). Some things are important but not the most important, which is why the special-special people do not have enough time to do them, so it’s up to you.
Be honest and admit that it’s not about what you can offer, but what status you hope to get in return.
What hypocrisy! If only highly intelligent people are worthy, then most humans never got a real opportunity. But those people don’t matter, I suppose; only the members of the intellectual elite should have the same opportunities. You are a member of the elite, but not a member of the elite-within-elite. Congratulations: now you can better empathize with the intellectual 98%.
People who are not fit for the elite work can still be welcome in the rationalist community.
Wait, don’t generalize so quickly. Perhaps being in the Bay Area is playing on hard mode, from this perspective. Move to a place where people are… more emotionally capable of being told they are not the planet’s #1.