The Value (and Danger) of Ritual

This is the second part of my Winter Solstice Ritual mini-sequence. The introduction post is here.

-

Ritual is an interesting phenomenon to me. It can be fun, beautiful, profound, useful… and potentially dangerous.

Commenters from the previous article fell into two main camps—those who assumed I knew what I was doing and gave me the benefit of the doubt, and those who were afraid I was naively meddling with forces beyond my comprehension. This was a reasonable fear. In this article, I’ll outline why I think ritual is important, why it’s dangerous, and why I think it’s relevant to an aspiring rationalist culture.

Before I start arguing how meaningful and transformative ritual can be, I want to argue something simpler:

It can be really fun.

This is not to be discounted. For whatever reason, humans tend to appreciate songs, stories and activities that they share with their tribe. Hedons from ritual can take the form of fun joviality as well as intense, profound experiences.

Not everything we evolved to do is good. If we feel an urge to hit the enemy tribesman with a huge rock and take their land, we can and should say “No, there are important game theoretic and moral reasons why this is a bad idea” and suppress the urge. But we can also devise new activities, like knocking the enemy tribesman over and taking their ball, satisfying that urge without the negative consequences of war. I’d like access to the experience that ritual uniquely offers, if it can be done safely.

Ritual covers a range of experience. One subset of that is a kind of art. To give you some sense of what I mean, here are a few clusters of activities:

  • Art, enjoyed alone for simple aesthetics.

  • Art that speaks to your beliefs.

  • Art that you enjoy appreciating with other people.

  • Beliefs that you enjoy sharing with other people.

  • Repetition of activities that you do every year.

And here are a few songs I like:

Art and Belief

I like Silent Night because it is a simple, tranquil song, often sung with skillful harmonies.

I like Carol of the Bells because it is a powerful, awe-inspiring song that is performed with immense complexity and skill.

I like Do You Hear What I Hear partly for the same reasons I like Silent Night—it begins with simple tranquility. But I also like it for ideological reasons—it showcases the power of a meme growing over time, magnifying, evolving and changing the lives of its hosts as they come to believe it. As an artist hoping to craft powerful memes, this is very important to me. I also like the imagery of the proud king, willing to listen to the words of a shepherd boy, acknowledging the importance of a small infant born into poverty far away.

And the king is able to command the attention of an entire nation: Take a moment, stop what you are doing, and pay attention to this child.

But Do You Hear What I Hear also bothers me slightly—it lies in the uncanny valley of ideological identification. The song strikes very close to home in my heart, and I want to give myself over to the song, not just to sing the words but to truly feel them in my heart. And I can’t, because there is a slight snag when we get to the proud king. The king is valuing the child for all the wrong reasons. I want the child to be important because all children are important. But this king would not have given the child a second thought if it hadn’t been the son of God. I don’t believe in Jesus, so the intended message of the song clashes with what I want it to be about.

For the most part I sing the song without thinking about this, but that little snag is there, and it prevents the song from being one of my favorites ever.

Contrast this with Silent Night, where the message is largely irrelevant to me. Or Carol of the Bells, whose message is “Bells are pretty and people like them.” I appreciate them aesthetically and I respect skilled performers. Their messages don’t bother me, but neither do I feel as strongly about them.

Art and Tribe

The Word of God is beautiful because the world is an incredible place, and humans have discovered millions of beautiful true things about it. There is exactly one thing I dislike about this song, and it is not a disagreement with its ideology. It’s just the use of the word “God.” I don’t think it was the wrong word to use—it’s a nice, simple word and I read it purely as a metaphor for “the universe.”

Like Do You Hear, there is some uncanny-valley effect here. But here it’s about tribal identification. (I draw a distinction between tribal identity and ideology—tribe is about identifying with a group of people, ideology is about identifying with a belief-structure.)

My mind snags because “God” is a word I normally associate with other cultures. This isn’t as big a deal as in Do You Hear. I don’t actually consider the goddists to be my enemy, I just don’t feel connected to them, and the word takes me out of the beauty of the song and reminds me of this disconnection. I did go ahead and include Word of God, verbatim, in the Winter Solstice Celebration. I just want to note that there are different reasons to be moved by (or fail to be moved by) a song.

[Edit in 2018: we’ve since re-written Word of God (after touching base with the original songwriter) to focus more purely on scientific progress rather than God. This was less because “God” was problematic and more because it kept the focus on a political conflict that didn’t seem good for Solstice long-term]

Finally, we have Singularity, which I like for all kinds of reasons.

The music begins whimsical and fun, but grows more powerful and exciting over time. If you have good speakers, there’s a heavy but subtle bassline that drives the sound through your bones. It was refreshing to hear an unapologetic vision of how amazing the future could be. And when the sound abruptly cuts out and the song resets, there’s another image I really like—that humanity is not special because we were created by some God for a grand purpose. Instead, we are special precisely because we were shaped by random forces in an un-extraordinary corner of the universe, and all of our potential power and greatness comes from our own desires, intellect and drive.

So it’s ideologically moving to me. But I didn’t really realize until I sang it in a group how tribally moving it could be. I wasn’t sure people would like the song. The chorus in particular sounds silly when you sing it by yourself. But as a group, everyone got really into it. Yes, the chorus was still a little silly, but we got up and waved our arms around and belted it out, and it felt really good to be part of a group who believed that this weird, outlandish idea was in fact very plausible and important.

So that was cool.

I also thought it slightly terrifying.

Songs like Singularity are what give me the most pause about encouraging Less Wrong culture and rituals.

Signaling Issues

There are two big issues with ritual. The first is how it makes other people perceive us.

Rituals are, almost by definition, symbolic actions that look a little weird from the outside. They normally seem okay, because they are ancient and timeless (or at least were created a few years before people started paying attention). But any Less Wrong ritual is going to have all the normal weirdness of “fresh” tradition, and it’s going to look extra strange because we’re Less Wrong, and we’re going to be using words like “ritual” and “tribal identification” to matter-of-factly describe what we’re doing.

Some people may be turned off. Skeptics who specifically turned to rationality to escape mindless ritual that was forced upon them may find this all scary. Quality, intelligent individuals may come to our website, see an article about a night of ritual and then tune out and leave.

I think this is an acceptable cost to pay. Because for good or for ill, most humans like emotional things that aren’t strictly rational. Many people are drawn to the Sequences not just because they say important things, but because Eliezer crafted an emotional narrative around his ideas. He included litanies and parables, which move us in a way that pure logic often can’t.

There are smart cynics who will be turned off, but there are also smart idealists who will be drawn to recognizable human emotional arcs. I don’t think ritual should be the FIRST thing potential newcomers see, but I think it is something that will get them fully involved with our community and the important things we believe. I think it may particularly help former theists, who have built their entire lives around a community and ritual infrastructure, make the transition into atheists who are proud of their new beliefs and do productive things.

It may even help current theists make the transition, if they can see that they WON’T have to give up that community and ritual infrastructure, and all the hedons that went along with it.

But there’s another cost to ritual that can’t be resolved quickly with a cost-benefit analysis.

[Update from 2016 - I want to clarify that while I think Less Wrong as a community is a reasonable place for ritual and culture-building, there are other related communities and organizations that are more “PR sensitive” and I don’t think should be connected to ritual]

Dangers of Reinforcement

Ritual taps into a lot of regions of our brain that are explicitly irrational, or at least a-rational. I don’t think we can afford to ignore those regions—they are too essential to our existence as humans to simply write off. We didn’t evolve to use pure logic to hold together communities and inspire decisions. Some people may be able to do this, but not most.

I think we need ritual, but I would be a fool to deny that we’re dealing with a dangerous force. Ritual is a self-reinforcing carrier wave for ideas. Those ideas can turn out to be wrong, and the art that was once beautiful and important can turn hollow or even dangerous. Even true ideas can be magnified until you ignite a happy death spiral, giving them far more of your time than they deserve.

Some of this can be mitigated by making traditions explicitly about the rational process, and building evaluation into the ritual itself. We can review individual elements and remove them if necessary. We can even plan to rewrite them into new parodies that refute the old idea, ceremonially discarding our old beliefs. But this will be a meaningless process unless we are putting in genuine effort—not just doing a dutiful review as good rationalists should.

You can recite the Litany of Tarski, but unless you are truly considering both possibilities and preparing yourself for them, the words are useless. No amount of pre-planning will change the fact that using ritual will require deliberate effort to protect you from the possibility of insanity.

You should be doing this anyway. There are plenty of ways to fall into a happy death spiral that don’t involve candle-lit gatherings and weird songs. When you’re dealing with ideas as powerful as the Singularity—a meme that provides a nice, sound-bite word that suggests a solution to all of the most terrible problems humanity faces—you should already be on guard against wishful thinking. When you’re talking about those ideas in a group, you should already be working hard—genuinely hard, not just performing a dutiful search—to overcome groupthink and evaporative cooling and maintain your objectivity.

You should be doing that no matter what. Every cause wants to be a cult, whether or not it has a nice word that sounds way simpler than it actually is and promises to solve all the world’s problems.

Ritual does make this harder. I’m particularly wary of songs like Singularity, which build up a particular idea that still has a LOT of unknown factors. An anonymous commenter from the Solstice celebration told me they were concerned about the song because it felt like they were “worshipping” the Singularity, and I agree, this is concerning, both for our own sanity and for the signaling it implies to newcomers who stumble upon this discussion.

I’d go ahead and exclude the song, and any meme that got too specific with too many uncertainties… except that a lot of our most powerful, beautiful images come from specific ideas about the future. A generic rallying cry of “Science!”, “Humanism!” or “Rationality!” is not a satisfying answer to the problems of Death and Global Suffering and Existential Risk. It’s not satisfying on an artistic level, an intellectual level or a tribal level. Having specific ideas about how to steer the future is what gives our group a unique identity. Caring too much about that identity is dangerous, but it can also be extremely motivational.

As of now, I’m not sure what I think about this particular problem. I look forward to commenters weighing in.

With all this dire warning, it may seem like a slam-dunk case to abandon the idea of ritual. Obviously, I disagree, for a few reasons.

The first is that honestly, gathering a few times a year to sing “Singularity! Singularity!”, even without all the preventative measures, simply pales in significance compared to… well… the entire Less Wrong community-memeplex doing what it does on a regular basis.

If we were genuinely concerned about making bad decisions due to reinforcement rituals, I’d start by worrying about much more mundane rituals, like discussing the Singularity all the time. Constantly talking about an idea trains your brain to think of it as important. Hanging out on forums with a constant stream of news about it creates confirmation and availability bias. If you’re concerned about irrationality, as opposed to weird ceremonies that might signal low status, you should already be putting a lot of effort into protecting yourself against a happy death spiral, and the extra effort you need to expend for a few nights of jubilant celebration shouldn’t be that significant.

The danger of ceremonial ritual in particular is real, but overestimating it isn’t much better than underestimating it, even if we’re just talking about ritual as a source of hedons that we’ve previously denied ourselves. Families across the world gather to sing songs about ideas they like, and while this may be a human behavior we need to sacrifice, I’m not going to do so out of fear of what *might* happen without a decent understanding of why.

But there’s more to it than that. And this is why I’ve worked so hard on this, and why I think the potential upsides are so important.

Aspiring Rationalist Culture

I had two major motivations for the Solstice celebration. One of them was to produce a fun event for my community, and to inspire similar events for people across the world who share my memes.

The other was personal: Rationality training has made me better at identifying good solutions, but it hasn’t made those solutions emotionally salient. This is particularly important when it comes to optimal philanthropy—a million starving people across the world simply can’t compete with a single smiling orphan I get to personally deliver a Christmas present to. And those people have an even harder time competing if they live in the distant future.

Scope insensitivity and time discounting can be hard to overcome. Worst of all is when the best solution might not work, and I may not even live to see it work, and I can never get the emotional satisfaction of knowing I actually helped anyone at all.

I constructed the Solstice celebration around a narrative, based on the interplay between past, present and future. The process of crafting that narrative was extremely valuable to me, and has helped me to finally give Existential Risk the weight it deserves. I haven’t committed to helping SIAI in particular, but I feel like I’m at a place where, if I got better information on how effective SIAI is, I’d be emotionally able to act on that information.

I don’t think the first execution of the Solstice celebration successfully provided other people with that experience, but I think there is tremendous potential in the idea. I’d like to see the development of a culture that doesn’t glorify any particular solution, but which takes important rationality concepts and helps people modify their emotions to match the actual expected values of actions that would otherwise seem cold, hard and calculating.

I think this may turn out to be very important.

In some ways this has me far more scared than ritual-as-hedons. People can gather for a night of jovial fun and come away mostly unchanged. Using ritual *deliberately* to modify yourself is risky, and it is perhaps riskiest if you think you have a good reason for doing so.

I don’t think this is that dangerous on the individual level. It was useful to me, and I think others may find value in it. Actually allowing yourself to be changed in a meaningful way requires effort beyond an initial, inspiring ceremony. (It took me several weeks of intense work *preparing* the Solstice celebration, cemented by a final moment when a few ideas clicked into place and I came up with a metaphor I could use to alter my emotions. I don’t know for sure whether this can be distilled into a process that others can use with less effort.)

Next year, I expect the people who come to Solstice for the comfort and hedons will get what they need, and if anyone wants to use it as a springboard for self-modification, they will be able to as well.

The possibility that most concerns me is a chain of events going something like this:

  1. Someone (possibly a future version of me, possibly any random person perusing these articles) will decide that this is important enough to deliberately propagate on a mass scale.

  2. Said person will become good enough at ritual-craft (or use additional dark arts techniques) to make this much more effective than I currently anticipate.

  3. The result is an effective but low-status self-propagating organization that ends up corrupting the original goal of “help people be better at following through on correct but emotionally unsatisfying choices.”

This scenario requires someone to put a lot of work in, and they would probably fail uneventfully even if they did. Even if events transpire this way, the problem is less that a hollow self-propagating memeplex exists (it’s not like we don’t already have plenty of them, one more won’t hurt that much) but that its association with Less Wrong and related things may tarnish our reputation.

I’d like to think this would be a bad thing, although truth be told I think it assumes a level of importance that Less Wrong hasn’t really carved out for itself yet in the greater world. But we are trying to gain status, and out of respect for the community I should acknowledge this risk and sincerely solicit feedback.

My current assessment is that a) this is unlikely, and b) any organization that’s trying to accomplish things on a large scale WILL have to accept the risk that it transforms into a hollow, self-perpetuating memeplex. If you don’t want that risk at all, you’re probably not going to affect the world in a noticeable way. Ritual-driven memeplexes tend to be religions, which many of us consider a rival tribe, so they carry more emotional weight in our risk assessment. But this can also happen to corporations, unions, non-profits and political movements that may have been genuinely valuable at some point.

I do plan to study this issue in more detail over the coming year. If anyone does have specific literature or examples that I should be aware of, I’d appreciate it. But my priors are that the few negative reactions I’ve gotten to this are based more on emotion than on a clear understanding of the risks.

My final concern is that this simply isn’t a topic that Less Wrong should feature that much of, partly because some people find it annoying and partly because we really should be spending our time developing tools for rational thinking and studying scientific literature. I have another article or two that I think would be valuable enough for the main page, and after that I’ll be inviting people to a separate mailing list if they want to collaborate.

This is the second post of the ritual mini-sequence. The next post is Designing Ritual.