I’ll try here to summarize (my guess at) your views, Adele. Please let me know what I’m getting right and wrong. And also if there are points you care about that I left out.
I think you think:
(1) Psychotic episodes are quite bad for people when they happen.
(2) They happen a lot more (than gen population base rates) around the rationalists.
(2a) They also happen a lot more (than gen population base rates) among “the kinds of people we attract.” You’re not sure whether we’re above the base rate for “the kinds of people who would be likely to end up here.” You also don’t care much about that question.
(3) There are probably things we as a community can tractably do to significantly reduce the number of psychotic episodes, in a way that is good or not-bad for our goals overall.
(4) People such as Brent caused/cause psychotic episodes sometimes, or increase their rate in people with risk factors or something.
(5) You’re not sure whether CFAR workshops were more psychosis-risky than other parts of the rationalist community.
(6) You think CFAR leadership, and leadership of the rationality community broadly, had and has a duty to try to reduce the number of psychotic episodes in the rationalist community at large, not just events happening at / directly related to CFAR workshops.
(6b) You also think CFAR leadership failed to perform this duty.
(7) You think you can see something of the mechanisms whereby psyches sometimes have psychotic episodes, and that this view affords some angles for helping prevent such episodes.
(8) Separately from “7”, you think psychotic episodes are in some way related to poor epistemics (e.g., psychotic people form really false models of a lot of basic things), and you think it should probably be possible to create “rationality techniques” or “cogsec techniques” or something that simultaneously improve most people’s overall epistemics and reduce people’s vulnerability to psychosis.
My own guess is that CFAR mostly paid an [amount of attention that made sense] to reducing psychosis/mania risks in the workshop context, after our initial bad experience with a mania/psychosis episode at an early workshop, when we did not yet realize this could be a thing.
The things we did:
tried to screen for instability;
tried to warn people who we thought might have some risk factors (but not enough risk factors that we were screening them out) after accepting them to the workshop, and before they’d had a chance to say yes. (We’d standardly say something like: “we don’t ask questions this nosy, and you’re already in regardless, but, just so you know, there’s some evidence that workshops of all sorts, probably including CFAR workshops, may increase risks of mania or psychosis in people with vulnerability to that, so if you have any sort of psychiatric history you may want to consider either not coming, or talking about it with a psychiatrist before coming.”)
tried to train our instructors and “mentors” (curriculum volunteers) to notice warning signs; checked in regularly as a staff to see if anyone had noticed warning signs in any participants; and, where sensible, talked with the participant to encourage them to sleep more, skip classes, avoid recreational drugs for a while, do normal grounding activities, etc. (This happened relatively often — maybe once every three workshops — but was usually a relatively minor matter. E.g., this would be a person who was having trouble sleeping and who perhaps thought they had a chance at solving [some long-standing personal problem they’d previously given up on] “right now,” in a way that weirded us out, but who also seemed pretty normal and reasonable still.)
I separately think I put a reasonable amount of effort into organizing basic community support and first aid for those who were socially contiguous with me/CFAR who were having acutely bad mental health times, although my own capacities weren’t enough for a growing community and I mostly gave up on the less near-me parts around 2018.
It mostly did not occur to me to contemplate our cultural impact on the community’s overall psychosis rate (except for trying for a while to discourage tulpas and other risky practices, and to discourage associating with people who did such things, and then giving up on this around 2018 when it seemed to me there was no real remaining chance of quarantining these practices).
I like the line of inquiry about “what art of rationality might be both good in itself, and increase people’s robustness / decrease their vulnerability to mania/psychosis-type failure modes, including much milder versions that may be fairly common in these parts and that are still bad”. I’ll be pursuing it. I take your point that I could in principle have pursued it earlier.
If we are going to be doing a fault analysis in which we give me and CFAR responsibility for some of our downstream memetic effects, I’d like CFAR to also get some credit for any good downstream memetic effects we had. My own guess is that CFAR workshops:
made it possible for EA and “the rationalist community” to expand a great deal without becoming nearly as “diluted”/“normie” as would’ve happened by default, with that level of immigration-per-year;
helped many “straw lesswrongers” to become more “agenty” and realize “problems are for solving” instead of sort of staring helplessly at their to-do lists and desires, and that this part made the rationalist community stronger and healthier;
helped a fair number of people to become less “straw EA” in the sense of “my only duty is to do the greatest good for the greatest number, while ignoring my feelings”, and to tune in a bit more to some of the basics of healthy life, sometimes.
I acknowledge that these alleged benefits are my personal guesses and may be wrong. But these guesses seem on par to me with my personal guess that patterns of messing with one’s own functioning (as from “CFAR techniques”) can erode psychological wholeness, and I’m afraid it’ll be confusing if I voice only the negative parts of my personal guesses.
(1) Yes
(2) Yes
(2a) Actually, I think I feel sure about that. It’s not that I don’t care about the question so much as that I feel it’s being used as an excuse for inaction / lack of responsibility.
(3) Yes, and I think the case for that is made even stronger by the fact of 2a.
(4) I don’t know that Brent did that specifically, but I have heard quite a lot of rumors of various people pushing extreme techniques/practices in maliciously irresponsible ways. Brent was emblematic of the sort of tolerance toward this kind of behavior I have seen. I’ve largely withdrawn from the community (in part due to stuff like this); I’m no longer on Twitter/X, Facebook, or Discord, and I don’t go to community events, so it’s plausible things are actually better now and I just haven’t seen it.
(5) Yeah, I’m not sure… I used to feel excited about CFAR, but that sentiment soured over the years for reasons illegible to me, and I felt a sense of relief when it died. After reflecting yesterday, I think I may have a sort of negative halo effect here.
Also, I think the psychosis incidents are the extremal end of some sort of badness that (specific, but unknown to me) rationality ideas are having on people.
(6) Yes, inasmuch as the psychosis is being caused by ideas or people from our sphere.
(6b) It appears that way to me, but I don’t actually know.
(7) Yes
(8) Yes. Like, say you ran an aikido dojo or whatever. Several students tear their ACLs (maybe outside of the dojo). One response might be to note that your students are mostly white, and that white people are more likely to tear their ACLs, so… sucks, but isn’t your problem. Another response would be to get curious about why an ACL tear happens: look for specific muscles to train up to reduce risk of injury, or for early warning signs, or for training exercises that are potentially implicated, etc. While looking into it, you warn the students clearly that this seems to be a risk, try to get a sense of who is vulnerable and not push those people as hard, and, once some progress has been made, dedicate some time to exercises or whatever that mitigate this risk. And kick out the guy encouraging everyone to do heavy sets of “plant and twist” exercises (“of course it’s risky bro, any real technique is gonna be like that”).
My complaint is basically that I think the second response is obviously much better, but the actual response has been closer to the first response.