Carrying the Torch: A Response to Anna Salamon by the Guild of the Rose
In a recent comment and follow-up post, Anna Salamon described some of the rocks upon which the Center for Applied Rationality has run aground, and invited anyone interested in the project of improving human rationality to pick up the torch.
We wanted to take this opportunity to remind you that the Guild of the Rose is here to carry that torch onward.
On a personal note, I figured I ought to also explicitly mention that the Guild is, in no small part, the result of many parties from all over the world reading and responding to the bat signal that I shined into the sky in 2018. In that post, an alien visitor with a rationality textbook exhorted us to dream a little bigger regarding the potential of the rationality movement. If that post spoke to you, then you ought to know it spoke to a lot of people, and, well, the Guild is what we’re doing about it together.
And I think we’re doing a pretty good job! Anna outlined a list of problems that she felt CFAR ran into, and I figured this would be a good place to describe how the Guild dealt with each of those problems.
Wait a minute – dealt with those problems? Anna just posted a few weeks ago!
When we started the Guild in 2020, we looked to CFAR both as an example to emulate and as an example to differentiate ourselves from. We diagnosed many of the same problems that Anna describes in her post, though not necessarily in the same framing, and we designed our organization to avoid those problems. We are grateful to CFAR for having pioneered this path.
Differentiating effective interventions from unfalsifiable woo.
The Guild focuses on actions and habits, not psychotherapy. I think ~0% of what we teach can be called unfalsifiable woo. Even when we tread more esoteric ground (e.g. the decision theory course) we focus on the practical and implementable.
To sketch a perhaps not-totally-generous metaphor, imagine there are two martial arts schools you’re trying to choose between:
One of these schools is esoteric, focuses on dogmatic cultivation of Warrior Spirit, and demands intensive meditation. This school promises a kind of transcendent self-mastery and hints that ki-blasts are not entirely off the table.
The other school focuses on punching bags, footwork drills, takedowns, and sparring. This school promises that if you need to throw hands, you’ll probably come out of it alive.
I think the vast majority of Human Potential Movement-adjacent organizations look more like the first school. Meanwhile, boring-looking organizations like the Boy Scouts of America, which focus almost entirely on the pragmatic practices of how to tie knots and start fires using sticks, probably succeed more at actually cultivating the “human potential” of their members.
Thus, the Guild focuses on the pragmatic. Our workshops cover effective, “boring” interventions like better nutrition, using your speaking voice more effectively, improving your personal financial organization, emergency preparedness, and implementing a knowledge management system, among many others. There is a new workshop almost every week.
Of course, we also teach what could be considered explicit rationality training. We have workshops focusing on epistemics, and on practical decision theory. But it’s our belief that one way to become exceptional is to simply not be below-average at anything important. It is sort of embarrassing to focus extremely hard on “having correct beliefs” while not having a basic survival kit in your car.
Instructors having impure motives, and mistakes amplifying when it comes to rewiring your brain.
We have put some policies in place to mitigate this kind of risk.
We mitigate the risks of rewiring members’ brains by not really trying to rewire members’ brains, any more deeply than, say, learning partial differential equations rewires your brain. Learning about decision trees, or vocal control, might open your eyes to broader potentialities and unseen solutions to problems in your life, but doesn’t undermine the bedrock of your identity.
Additionally, our approach asks for a weekly 1-3 hour time commitment. Any “rewiring” will be done very, very slowly and incrementally. I think a lot of the danger of X-rationality techniques comes from jumping directly into the deep end and trying to refactor your whole psyche in a week. This is always a bad idea. We don’t advocate trying to do this, and we don’t do this in our workshops.
Finally, instructors simply aren’t given all that much power or control. The leadership vets the course content before inflicting it upon the Guild at large. The leadership council are a varied and skeptical bunch. We don’t allow content or exercises we’re not comfortable with. There is also an implicit norm within the leadership that pushes back against what you might think of as a Slytherin approach to life. We’ve been cautious with content related to social skills, for example, because the potential for bad outcomes is higher.
Insufficient feedback between rationality training and the real world.
As stated above, we focus on smaller, more empirically testable pieces of rationality material, plus practical knowledge and skills. This low-hanging fruit is far from being picked. I sometimes feel that people signing up for a CFAR workshop (or a Tony Robbins workshop, or whatever) believe that they are (metaphorically) already qualified to be UFC fighters and just need some kind of tuning to their mental game to win the championship. In reality, most people in this situation are likely not even above-average at punching and footwork.
Another way of saying this is that maybe your “mental game” is part of the problem, but probably the best way to improve your mental game is to improve your footwork while simultaneously establishing good self-talk. Approximately nobody ever thought themselves into being excellent at anything.
One of the many common Curses of Smart is being hypercompetent at a couple of tricks that you’ve adopted to be passably effective at life while being incredibly blind to your own limitations and bad habits. The Guild aims to bring all of a member’s crucial competencies up to “average.” To continue the martial arts metaphor, if you are not at least average in punching, takedowns, etc., then you probably have no business signing up for a cage fight. Further, you’re probably wasting your time paying money to learn the Secrets of Inner Energy Harmonics at a mountain retreat.
Anna mentions in this section that CFAR tried to use “AI risk” as a kind of tether to reality, a kind of pragmatic testing ground. In the Guild, the pragmatic testing ground is just whatever is going on in your life. Our testing ground is: what to do about a broken car door handle, how much to spend on a haircut, and how to pitch your voice in a Zoom meeting. Your Mandatory Secret Identity is your life and all the important stuff you’ve got going on. I hope it is obvious why this is a better reality tether.
Every cause wants to be a cult.
Early on in the game, we looked up the research on what exactly it is about cults that makes them cults, and then installed explicit policies to do the opposite of those things. For example, cults isolate people and encourage them to spend their energy on the cult. Part of the Guild rank-advancement structure encourages members to do service projects that are not connected to the Guild, e.g. service to some other physical or online group they consider themselves part of. Another example: cults are very demanding of members’ time, making it difficult for them to maintain separate personal lives. The Guild, as a matter of policy, asks no more than 3 hours per week of members’ time, and does not penalize absence. A final less serious example is that for important Council meetings or other serious Guild business, we wear silly hats, to mitigate against taking ourselves too seriously.
We also don’t really put much demand on members’ belief structure; you will find no mention of X-risk, AI alignment, or egregores in our course materials. I think this helps a bit with not being culty.
Also, “occult” literally means “hidden”, and our course materials are freely available and totally transparent, for what that’s worth.
I want to make a specific point here, because an uninformed observer might remark that the Guild looks more like a cult than CFAR does, because the Guild is a “group” of which you become a “member,” while CFAR is just a school that you attend briefly and then leave. But this broad kind of criticism actually seems to round off to an assertion that “groups/organizations are bad,” which I do not think most people would endorse. In fact, I think “groups/organizations are (often) good” is the more supportable assertion!
People like being part of groups. Groups, with prosocial norms and requirements, are a public service provided to the commons; such groups motivate people to do good and useful things they probably would not otherwise have done. Being part of a group and identifying as a member motivates members to become better at whatever the group puts emphasis on. The Guild rank advancement system has demanding norms and requirements by design. Being a “person who attended a workshop” is not really being part of a group. Also, the Guild provides the Cohort structure, wherein each member is placed into a unit of ~6 people, with whom they primarily interact and work on course materials. Via the Cohorts, the Guild facilitates actual long-term friendship and community, which I would argue is an intrinsic good requiring no further justification beyond itself!
(As an aside, Guild members reliably remark that about half of the value of the Guild comes from the course content, and the other half comes from their relationships with their cohort. Make of this what you will.)
One takeaway from this post might be that the Guild of the Rose and CFAR are trying to do such different things that you have to wonder if we are even carrying the same torch. I think there is enough overlap regarding the crucial elements of our purpose that we probably are. The alien visitor with the rationality handbook would recognize both CFAR and the Guild of the Rose as worthy instantiations of the vision of Rationality Community, on the way to some shared terminus.
I ask at this point that you rely on the Virtue of Empiricism to validate my claims. First of all, our course material is all freely available and you can audit it at your whim. However, to give us a fair shake, I recommend that you join the Guild of the Rose via a 30-day free trial. I suggest this because the “active ingredient” of the course content is showing up and doing the exercises with your cohort, the small group of comrades you will be sorted into, and with whom you will, very probably, become friends.