I don’t actually know baseline rates or rationalist-rates (perhaps someone wants to answer with data from annual rationalist census/survey questions?), so I’m not sure to what extent there is an observation here to explain.
But it does seem to me that there is more of it than baseline; and I think a first explanation has to be a lot of selection effects? I think people likely to radically change their minds about the world, question consensus, and believe things that are locally socially destabilizing (e.g. “there is no God”, “I am not the gender that matches my biological sex”, “the whole world might end soon”, etc.) are more likely to be (relatively) psychologically unstable people.
Like, some of the people around us who I think have psychotic/manic episodes are indeed people you could tell within the first 10 minutes were psychologically different from those around them. For example, I once observed someone at a rationalist event failing to follow a simple physical instruction, while seeming not to realize they weren’t successfully following it, and I got a distinct crazy-alarm from them; I later learned that they had been institutionalized a lot earlier in their life with psychotic episodes, and were religious.
Still, I do think I’ve seen immense mental strain put on people who are otherwise relatively psychologically healthy. I think a lot of people do very hard things with little of a support net, which has caused them very bad experiences. But on the other hand, I really see very few severely bad mental episodes in the people I actually know and meet (I’m struggling to think of a single one in recent years). And for events I run, I generally select against people who exhibit strong signs of mental instability. I don’t want them to explode, have a terrible experience, and cause other people terrible experiences.
Probably CFAR has had much more risk of this than Lightcone? CFAR more regularly has strangers come to an intense event for many days in a row about changing your mind and your identity to become stronger, disconnected from your normal life that whole time, whereas I have run fewer such events. (Perhaps Inkhaven will be the most intense event I have run! I have just made a note to my team to have check-ins about how the residents are doing on this dimension; thanks for prompting that to happen.)
I’m not really sure what observations Adele is making / thinking about, and would be interested to read more of those (anonymized, or abstracted, naturally).
Added: I just realized that perhaps Adele just wanted this thread to be between Adele/Anna. Oops, if so.
I don’t dispute that strong selection effects are at play, as I mentioned earlier.
My contention is that even among such people, psychosis doesn’t just happen at random. There is still an inciting incident, and rationalist-y ideas often seem to be implicated. More broadly, I feel that there is a cavalier attitude towards doing mentally destabilizing things. And like, if we know we’re prone to this, why aren’t we taking it super seriously?
The change I want to have happen is for there to be more development of mental techniques/principles for becoming more mentally robust, and for this to be framed as a prerequisite for the Actually Changing Your Mind (and other potentially destabilizing) stuff. Maybe substantial effort has been put into this that I haven’t seen. But I would have hoped to have seen some sort of community moment of “oh shit, why does this keep happening?!? let’s work together to understand it and figure out how to prevent or protect against it”. And in the meantime: more warnings, the way I feel the risks of meditation have been more adequately warned about.
Thanks for deciding to do the check-ins; that makes me glad to have started this conversation, despite how uncomfortable confrontation feels for me still. I feel like part of the problem is that this is just an uncomfortable thing to talk about.
My illegible impression is that Lightcone is better at this than past-CFAR was, for a deeper reason than that. (Okay, the Brent Dill drama feels relevant.)
I’m mostly thinking about cases from years ago, when I was still trying to be socially part of the community (before ~2018?). There was one person in the last year or so whom I was interested in becoming friends with, and this then happened to them, which made me think it continues to be a problem, but it’s possible I over-updated. My models are mainly coming from the AI psychosis cases I’ve been researching.
As I see it, the problem is the following:
1. I would like to have the kind of debate where anything is allowed to be said and nothing is taboo;
2. this kind of debate, combined with some intense extreme thoughts, causes some people to break down;
3. it feels wrong to dismiss people as “not ready for this kind of debate”, and we probably can’t do it reliably.
The first point because “what is true, is already true”; and also because things are connected, and when X is connected to Y, being wrong about X probably also makes you somewhat wrong about Y.
The second point because people are different, in how resilient they are to horrible thoughts, how sheltered they have been so far, whether they have specific traumas and triggers. What sounds like an amusing thought experiment to one can be a horrifying nightmare to another; and the rationalist ethos of taking ideas seriously only makes it worse as it disables the usual protection mechanisms of the mind.
The third point because many people in the rationality community are contrarians by nature, and telling them “could you please not do X” practically guarantees that X will happen, while explaining to them why X is a bad idea only results in them explaining to you why you are wrong. Then there is the strong belief in the Bay Area that excluding anyone is wrong; also, various people who have various problems and have in the past been excluded from places would be triggered by the idea of excluding people from the rationality community. Finally, some people would suspect that this is some kind of power move; like, if you support some idea, you might exclude people who oppose this idea as “not mature enough to participate in the hardcore rationalist debates”.
Plus, there is the fact that when all debates happen in the open, people already accuse us of being cultish; if the serious debates started happening behind closed doors, accessible only to people already vetted e.g. by Anna, I am afraid this might skyrocket. The Protocols of the Elders of TESCREAL would practically write themselves.
You mention the risks associated with meditation… it makes me wonder how analogous the situation is. I am not an expert, but it seems to me that with meditation, the main risk is the meditation itself: not hanging out with people who meditate, nor hearing about their beliefs. What is it like with the rationality-community-caused mental breakdowns? Do they only happen at minicamps? Or is exposure to the rationality community enough? Can people go crazy merely by reading the Sequences? By hanging out at Less Wrong meetups?
I agree that the safety of new members in the rationality community seems neglected. In the past I have suggested that someone should write material on the dangers related to our community, which each new member should read. The things I had in mind were more like “you could be exploited by people like Brent Dill” rather than psychosis, but all the bad things should be mentioned there. (Analogous to the corporate safety trainings at my company, which remind us not to do X, Y, Z, illustrated by anonymized stories about bad things that happened when people did X, Y, Z in the past.) Sadly, I am too lazy to write it.
I think there’s a broader property that makes people not-psychotic, that many things in the bay area and in the practice of “rationality” (not the ideal art, but the thing folks do) chip away at.
I believe the situation is worse among houses full of unemployed/underemployed people at the outskirts of the community than it is among people who work at central rationalist/EA/etc organizations or among people who could pay for a CFAR workshop. (At least, I believe this was so before covid; I’ve been mostly out of touch since leaving the bay in early 2020.)
This “broader property” is something like: “the world makes sense to me (on many levels: intuitive, emotional, cognitive, etc), and I have meaningful work that is mundane and full of feedback loops and that I can tell does useful things (eg I can tell that after I feed my dog he is fed), and many people are counting on me in mundane ways, and my friends will express surprise and check in with me if I start suddenly acting weird, and my rough models are in rough synchrony also with the social world around me and with the physical systems I am interacting with, and my friends are themselves sane and reasonable and oriented to my world such that it works fine for me to update off their opinions, and lots of different things offer useful checksums on lots of different aspects of my functioning in a non-totalizing fashion.”
I think there are ways of doing debate (even “where nothing is taboo”) that are relatively more supportive of this “broader property.” Eg, it seems helpful to me to spend some time naming common ground (“we disagree about X, and we’ll spend some time trying to convince each other of X/not-X, but regardless, here’s some neighboring things we agree about and are likely to keep agreeing about”). Also to notice that material reality has a lot of detail, and that there are many different questions and factors bearing on (AI or whatever) that don’t correlate that much with each other.
“houses full of unemployed/underemployed people at the outskirts of the community”
Oh, this wasn’t even a part of my mental model! (I wonder what other things I am missing that are so obvious to the local people that no one even mentions them explicitly.)
My first reaction is shocked disbelief: how can there be such a thing as an “unemployed… rationalist… living in the Bay Area”, and even “houses full of them”...
This goes against several of my assumptions, such as “the Bay Area is expensive”, “most rationalists are software developers”, “there is a shortage of software developers on the market”, “there is a ton of software companies in the Bay Area”, and maybe even “rationalists are smart and help each other”.
Here (around the Vienna community) I think everyone is either a student or employed. And if someone has a bad job, the group can brainstorm how to help them. (We had one guy who was a nurse; everyone told him he should learn to code, he attended a 6-month online bootcamp, and then he got a well-paying software development job.) I am literally right now asking our group on Telegram to confirm or disconfirm this.
Thank you; to put it bluntly, I am no longer surprised that some of the people who can’t hold a job would be deeply dysfunctional in other ways, too. The surprising part is that you consider them a part of the rationalist community. What did they do to deserve this honor? Memorized a few keywords? Impressed other people with skills unrelated to being able to keep a job? What the fuck is wrong with everyone? Is this a rationalist community or a psychotic homeless community or what?
...taking a few deep breaths...
I wonder which direction the causality goes. Is it “people who are stabilized in ways such as keeping a job will remain sane”, or rather “people who are sane find it easier to get a job”? The second option feels more intuitive to me. But of course I can imagine it being a spiral.
“it seems helpful to me to spend some time naming common ground”
Yes, but another option is to invite people whose way of life implies some common ground, such as “the kind of people who could get a job if they wanted one”.
I imagine that in Vienna, the community is small enough that if someone gets excited by rationalist ideas and wants to meet with other rationalists in person, there essentially is just the one group. And also, it sounds like this group is small enough that having a group brainstorm to help a specific community member is viable.
In the Bay Area, the community is large enough that there are several cliques which someone excited by rationalist ideas might fall into, and there’s no central organization with the authority to say which ones are or aren’t rationalist, nor is there a common standard for rationalists. It’s also not clear which cliques (if any) a specific person is in when you meet them at a party or whatever, so even though there are cliques with bad reputations, it’s hard to decisively exclude them. (And also, Inner Ring dynamics abound.)
As for the dysfunctional houses thing, what seems to happen is something like: Wow, this rationalism stuff is great, and the Bay Area is the place to be! I’ll move there and try to get a software job. I can probably teach myself to code in just a couple of months, and being surrounded by other rationalists will make it easier. But gosh, is housing really that expensive? Oh, but there are all these group houses! Well, this one is the only one I could afford and that had room for me, so I guess I’ll stay here until I get a proper job. Hmm, is that mold? Hopefully someone takes care of that… And ugh, why are all my roommates sucking me into their petty drama?! Ughhhh, I really should start applying for jobs—damn this akrasia! I should focus on solving that before doing anything else. Has it really been 6 months already? Oh, LSD solved your akrasia? Seems worth a try. Oh, you’ll be my trip-sitter and guide me through the anti-akrasia technique you developed? Awesome! Woah, I wasn’t sure about your egregores-are-eating-people’s-souls thing, but now I see it everywhere...
This is a hard problem for the community-at-large to solve, since it’s not visible to anyone who could offer some real help until it’s too late. I think the person in the vignette would have done fine in Vienna. And the expensive housing is a large factor here: it makes it much harder to remove yourself from a bad situation, and constantly eats up your slack. But I do think the community has been negligent and reckless in certain ways which exacerbate this problem, and that is what my criticism of CFAR here is about. Specifically, contributing towards a culture where people try to share all these dubious mental techniques that will supposedly solve their problems, and a culture where bad actors are tolerated for far too long. I’m sure there are plenty of other things we’re doing wrong too.
Thank you, the description is hilarious and depressing at the same time. I think I get it. (But I suspect there are also people who were already crazy when they came.)
I am probably still missing a lot of context, but the first idea that comes to my mind is to copy the religious solution and do something like Sunday church, to synchronize the community. Choose a specific place and a repeating time (it could be e.g. every other Saturday, or whatever) where rationalists are invited to come and listen to some kind of news and lectures.
Importantly, the news and lectures would be given by people vetted by the leaders of the rationality community. (So that e.g. Ziz cannot come and give a lecture on bicameral sleep.) I imagine e.g. 2 or 3 lectures/speeches on various topics that could be of interest to rationalists, and then someone gives a summary of what things of interest to the community have happened since the last event, and what is going to happen before the next one. Afterwards, people either go home or hang out together in smaller groups unofficially.
This would make it easier to communicate stuff to the community at large, and also draw a line between what is “officially endorsed” and what is not.
(I know many people are allergic to copying religious things—making a huge exception for Buddhism, of course—but religions do have technology for handling some social problems.)
“The surprising part is that you consider them a part of the rationalist community. What did they do to deserve this honor?”
(Noting again that I’m speaking only of the pre-2020 situation, as I lack much recent info.) Many don’t consider them part of “the” community. This is part of how they come to be not-helped by the more mainstream/healthy parts.
However: they are seeded by people who were deeply affected by Eliezer’s writing, and who wanted to matter for AI risk, and who grabbed some tools and practices from what you would regard as the rationality community, and who then showed their friends their “cool mind-tools” etc., with the memes evolving from there.
Also, it at least used to be that there was no crisp available boundary: one’s friends will sometimes have friendships that reach beyond, and so habits will move from what I’m calling the “periphery” into the “mainstream” and back.
The social puzzle faced by bay area rationalists is harder than that faced by eg Boston-area rationalists, owing mostly I think to the sheer size of the bay area rationality community.
“Then there is the strong belief in the Bay Area that excluding anyone is wrong; also, various people who have various problems and have in the past been excluded from places would be triggered by the idea of excluding people from the rationality community.”
I just want to say that, while it has in the past been the case that a lot of people were very anti-exclusion, and some people are still that way, I certainly am not, and this does not accurately describe Lightcone; we are regularly involved in excluding or banning people for bad behavior. Most major events of a certain size that we are involved in running have involved some amount of this.
I think this is healthy and necessary, and that the attempt to include everyone, or to always make sure that whatever stray cat shows up on your doorstep can live in your home, is very unhealthy and has led to a lot of past problems and hurtful dynamics.
(There’s lots more details to this and how to do justice well that I’m skipping over, right now I’m just replying to this narrow point.)
“Added: I just realized that perhaps Adele just wanted this thread to be between Adele/Anna. Oops, if so.”
I’d like comments from all interested parties, and I’m pretty sure Adele would too! She started this conversation on my post about the new pilot CFAR workshops, and I asked if she’d move it here, but she mentioned wanting more people to engage, and you (or others) talking seems great for that.
See context in our original thread.