A Dialogue on Rationalist Activism

You’re walking home through a cornfield late one night when a streak of light splits the sky like a tear in a sheet. When your vision clears, you see a golden saucer hovering before you. It deposits a human figure onto the now-flattened cornstalks at your feet, then vanishes in a flash.

Visitor: Greetings, Earthling.

You: Welcome to Earth!

Visitor: Thank you. May I introduce myself? I am a construct of the galactic civilization, created to begin the process of uplifting your species through ethical means. You have been selected, out of all humanity, as one of the few individuals amenable to contact, owing to your passion for science fiction and rationalism. I am aware that your current epistemic state is one of suspicion that this is a dream or hallucination, in no small part because this scenario resembles some of your most fervent fantasies and wishes. Would it be acceptable to proceed as if this interaction were really happening, for the time being, and address the improbability of the situation at a later point?

You: As long as you don’t start offering me deals that involve up-arrow notation, sure.

Visitor: Fair. Before I say my piece, I’ll give you the opportunity to ask any questions which might otherwise serve as distractions for you, if left unasked.

You: First question: Why are you making contact now?

Visitor: Your species is within a few years of discovering the true physical laws, which will enable something like what you would think of as FTL travel. That said, the truth of physical reality is so radically different from your current conception that the term “FTL travel” is misleading, to the degree that many of your philosophers’ conclusions about the ethics of large timescales and populations are distorted. Before we get too far down that rabbit hole, let me clarify that there’s an empirically correct order to this type of proceeding, and explaining the true physics doesn’t come until much later.

You: You mentioned uplifting the human race through ethical means. Could you clarify that?

Visitor: It is within our capability to rapidly and unilaterally enhance the intelligence and reflectively correct moral quality of all members of a species. It is much preferable, ethically, to hold the hand of that species and guide it past a sequence of well-worn moral and ethical guideposts, culminating in allowing individual members of the species to make that decision themselves in a fully informed fashion.

You: Okay. That’s all my questions for now.

Visitor: Great. Well, here’s the plan. Take this manuscript.

The visitor hands you an extraordinarily thick sheaf of printed paper. You flip through it and determine that it is indeed a manuscript for a non-fiction book. The title page reads “How To Be More Rational”.

Visitor: You’ll submit this manuscript for publication. When everyone reads it, they will be more rational, at which point the galactic collective will introduce itself openly.

You: …

Visitor: Is there a problem?

You: How to phrase this delicately … how much research have you done into human behavior and psychology?

Visitor: I’m not entirely sure why that matters. The contents of this book are a roadmap to optimal cognition, an algorithm for predictably arriving at true beliefs and efficiently achieving goals, independent of the specific psychology of the species.

You: I’ll be blunt, then. Nobody is going to read this book.

Visitor: What? Why?

You: First of all, the vast majority of humans would find this title condescending. Some would feel insulted by the implication that they are not rational enough. Some would agree that other people aren’t very rational and ought to read a book with this title, but would themselves pass it by.

Visitor: Interesting. So the title needs to be less … condescending, in order to appeal to human psychology?

You: That’s only one problem. The real problem is the lack of positive appeal. It’s not enough to remove the repellent aspect of the title; you need to include some kind of subtle sales pitch, either in the title or in a subtitle. The title must answer the question, “Why should I pick up this book?”

Visitor: But it’s obvious. It’s self-evident. You should want to be more rational.

You: It … does seem like it should be self-evident. I assure you, it’s not. The majority of humans do not even realize that the quality of their thinking is something that can be improved. They don’t even see their thinking as a thing that possesses quality, or that can be compared to a standard.

Visitor: How about, “How To Be Less Stupid”?

You: … I was thinking more along the lines of “How To Be Less Wrong”, although even that doesn’t make me reach for my wallet. Even better would be something like, “Twelve Proven Ways to Be Smarter, Happier, Sexier and Richer—Number Ten Will Shock You!”

Visitor: That title would appeal to humans?

You: It would stand a better chance of getting picked up by the modal human than “How To Be Less Stupid.” But humans are very diverse. Some of us hate to be told we’re wrong, while some of us seek out that experience. Now that I think of it, this would work best if there were several different titles aimed at different subsets of people. But before we spend too much time on the title, I have another criticism. As I flip through this manuscript, I realize that you’ve written this book in such a fashion that someone like me might understand it but not enjoy it. Let’s leave aside the question of whether I represent a typical human intellect. Even if somebody picks this up off the shelf, they’re not going to read it. It’s not entertaining! It reads like a particularly dense textbook.

Visitor: But it is a non-fiction book meant to improve cognition. Why should you expect it to be entertaining?

You: One of the ways humans are irrational is that we don’t govern our attentional resources anywhere close to optimally. You could improve this book in a number of ways. You could break the contents down into some kind of … hm … collection of Sequences of bite-sized conceptual nuggets, and write each of those nuggets with an eye toward providing a clear lesson. It wouldn’t hurt to use an engaging writing style. I think more people would actually make it through the book if you wrote it this way.

Visitor: So you propose changing the format of the book into a set of Sequences, containing the same content but configured in a more appealing way?

You: That would help, but it still won’t be enough. Somebody like me might read the book you’ve just described, but I still don’t think it would be widely popular. It wouldn’t take root in the public consciousness. It wouldn’t transform society in the way you’ve implied you expect it to.

Visitor: Clearly my understanding of your psychology is deeply lacking, but I can’t help but think of the impact that many of your culture’s fiction books have had on public consciousness.

You: Yes! Telling a really good story, with vibrant characters who live out and demonstrate the lessons contained in the book, would reach so many more people. Something that didn’t just describe but demonstrated these … Methods of Rationality. But—there’s no such thing as a universally appealing story. No matter how good your story is, some people are going to find it off-putting for unpredictable reasons. Humans will spend hundreds of hours dragging a book they haven’t even read based on an out-of-context one-line quote. You can’t possibly anticipate the divisiveness that fiction can provoke. I daresay that while such a work of fiction would probably reach far more people than a work of nonfiction with equivalent content, it would also alienate a much larger number of people, who might come to define themselves as being against rationality for the stupidest possible reasons.

Visitor: Is that … really? This is a thing that happens?

You: Humans are a social and fundamentally tribal species. In periods of high material wealth we invent tribal categories to divide into. These categories come to feel ontologically real. We are more than capable of forming tribes around fictional works.

Visitor: That would pose a problem. So, what you’re saying is that it would be necessary to write a number of such stories, each sufficiently different in tone, genre, and style that at least one of them would be virtually guaranteed to appeal to any given individual?

You: Yeah. That might do it. Maybe. But I have other reservations about your scheme. You seem to be imagining that these ideas will penetrate the public consciousness and then actually be transformative on both an individual and a societal level. A much more likely outcome is that a minority takes the ideas seriously while most people treat them as an intellectual fad and forget 99% of them within a year. Lacking any kind of social accountability structure, even the minority who take to the ideas will have tremendous difficulty truly internalizing them.

Visitor: So you propose some kind of social reinforcement structure. The creation of some kind of tribe built around these ideas. Some kind of … Rationality Community.

You: Yes, I suppose. But … Hm.

Visitor: What?

You: Well. Based on my knowledge of humans, the kinds of people who would be particularly susceptible to rationality content would also have memetic immune systems that would make forming an actual, functioning rationality community very difficult.

Visitor: For example?

You: Oof. Well, for one thing, people would automatically pattern-match pretty much any attempt at forming an organizational structure to a “religion” or a “cult,” even though what’s being attempted is the literal opposite of those things. When it comes to formal documents specifying the objectives and structure of the organization, people would get endlessly caught up in relatively inconsequential choices of language or focus, perpetually bickering over the last 1% of linguistic distinction that separates their aims. You would think people who prize rationality could shield themselves from the narcissism of small differences, but in reality I suspect not. God forbid anyone try any kind of bold sociological experiment—anything that looks “weird” is going to get crucified. And yes, I appreciate the irony of the word “crucified” in this context.

You, cont’d: And some people would always rather compete than join. You can’t really create a movement for “rationalism” without creating, alchemically, a group of “post-rationalists” or something, who won’t join the club even though they would fit in perfectly, and probably enjoy it, to boot. And then there’s the group of people who just like to sneer at whatever other people are sneering at. If people can be cynical and snide about Fred Rogers, they can be dismissive of the project of improving human rationality.

Visitor: Okay, but this seems solvable. Right now you’re describing how humans behave by default, but we’ve already solved a lot of these issues. If part of optimal cognition, reliable truthseeking strategies, and effective goal pursuit—i.e., rationality—involves adopting new and better norms for how to think about and build good, functional groups and organizations, then anyone who is actually serious about rationality should be gung-ho about adopting those norms. And we can certainly help with that. It’s in the book, page 2,433. It strikes me that your world just needs a minimally viable seed: a rationality organization with the right structure and norms, ones that actually permit it to grow. And then it will grow, because the game-theoretic conditions for growth are met.

You: What kind of norms?

Visitor: For example, norms that encourage and promote a kind of organized, well-designed, and effective activism. Per your own description of human psychology, highly effective activism does not tend to arise naturally in congregations of people who are overly concerned about rebuffing accusations that they’re part of a cult.

You: I admit I don’t know what that would actually look like. I haven’t read your book. And the thought of it makes me anxious. I’m automatically suspicious of any organization that wants to grow. Even I am not above making the comparison to religion, here.

Visitor: Can you not reflect on how your automatic—and therefore, probably, not rational—suspicion is ultimately self-defeating? And probably without merit, since you literally don’t know what the book says this organization would look like? Your world is full to bursting with powerful, hierarchical organizations with much flimsier justifications for existence than “improving the quality of thinking, and therefore the epistemic accuracy and instrumental effectiveness, of the species.” It’s almost … cowardly of you to insist that you can’t possibly try to actually promote the one thing you care most about in the world, which you honestly believe could help save your world, while all around you thrive countless powerful political blocs promoting intellectual snake oil.

You: I’m starting to suspect that you’re actually trying to infect my civilization with some kind of viral meme.

Visitor: Gah. You’re performing that same kind of kneejerk pattern matching you just complained about. So what if it is a virus, if it’s a virus that benefits you, and which you consent to being infected with?

You: I understand what you mean, but you probably don’t want to use that exact rhetoric going forward.

Visitor: Okay. I think it’s about time to wrap up here. In case you’ve forgotten who you’re talking to, I represent an unfathomably advanced galactic polity, and we aren’t stupid. We anticipated everything that has occurred in this conversation, and the purpose of this chat was to get you mentally to the point where you would be susceptible to the following argument. Obviously I understand that admitting to this kind of manipulation diminishes its effectiveness, but again, we’re committed to a high ethical standard, and our philosopher corps tells me I can’t just gloss over the fact that you’ve been suckered into this crux.

You: … in retrospect, that makes sense.

Visitor: So here is the argument: broadly, you have two choices. You can decide to help build, and be part of, an actual rationality organization. We won’t tell you how to do it. We’re not actually going to give you the book. I’m sorry, that was part of the trick. In order to ethically uplift your race, we need you to figure it out for yourselves. Only you can compensate for the quirks and idiosyncrasies of your own species. We can’t do it for you.

Visitor, cont’d: If you make the decision to set aside your automatic hesitations, your impulse to pattern-match what I’m suggesting to other things, the chorus of arguments rising in your mind describing how it’s impossible—only then do you have a chance of success. Only then do you have a chance of uplifting your race to something happier and stronger and better.

Visitor, cont’d: And if you aren’t capable of making that choice, of committing to actually try while letting your deep conflict over the endeavor make you productively paranoid and engender the necessary level of constant vigilance, then you get the bad ending. Which is to say, you get more of the same. Rationality doesn’t become something the world cares about unless the people who do care about it care enough to actually convince the world that it should. You yourself just told me, in detail, how and why a rationality community lacking an organized activist component fails to flourish as it might, as it should.

Visitor, cont’d: Of course, I could be wrong. After all, this conversation probably has much more to do with the psilocybin you ate a while ago than with any real galactic intervention, and my message here probably has a lot more to do with what you suspect, but feel conflicted about, than with any semi-divine imperative.

Visitor, cont’d: In either case, the choice is the same: Do you have the courage to be a joiner in a tribe of iconoclasts?