A Dialogue on Rationalist Activism

You’re walking home through a cornfield late one night when a streak of light splits the sky like a sheet. When your vision clears, you see a golden saucer hovering before you, which deposits a human figure onto the now-flattened cornstalks right in front of you, and then vanishes in a flash.

Visitor: Greetings, Earthling.

You: Welcome to Earth!

Visitor: Thank you. May I introduce myself? I am a construct of the galactic civilization, created to begin the process of uplifting your species through ethical means. You have been selected, among all humanity, as one of a small number of individuals amenable to contact, due to your passion for science fiction and rationalism. I am aware that your current epistemic state is one of suspicion that this is a dream or hallucination, in no small part due to the resemblance of this scenario to some of your most fervent fantasies and wishes. Would it be acceptable to proceed as if this interaction were really happening, for the time being, and address the improbability of this situation at a later point?

You: As long as you don’t start offering me deals that involve up-arrow notation, sure.

Visitor: Fair. Before I say my piece, I’ll give you the opportunity to ask any questions which might otherwise serve as distractions for you, if left unasked.

You: First question: Why are you making contact now?

Visitor: Your species is within a few years of discovering the true physical laws, which will enable something like what you would think of as FTL travel. That said, the truth of physical reality is so radically different from your current conception that the term “FTL travel” is misleading, to the degree that many of your philosophers’ conclusions about the nature of ethics involving large timescales and populations are distorted. Before we get too far down that rabbit hole, let me clarify that there’s an empirically correct order to this type of proceeding, and explaining the true physics doesn’t come until much later.

You: You mentioned uplifting the human race through ethical means. Could you clarify that?

Visitor: It is within our capability to rapidly and unilaterally enhance the intelligence and reflectively-correct moral quality of all members of a species. It is much preferable, ethically, to instead hold the hand of that species and guide it past a sequence of well-worn moral and ethical guideposts, culminating in allowing individual members of the species to make that decision themselves in a fully informed fashion.

You: Okay. That’s all my questions for now.

Visitor: Great. Well, here’s the plan. Take this manuscript.

The visitor hands you an extraordinarily thick sheaf of printed paper. You flip through it and determine that it is indeed a manuscript for a non-fiction book. The title page reads “How To Be More Rational”.

Visitor: You’ll submit this manuscript for publication. When everyone reads it, they will be more rational, at which point the galactic collective will introduce itself openly.

You: …

Visitor: Is there a problem?

You: How to phrase this delicately … how much research have you done into human behavior and psychology?

Visitor: I’m not entirely sure why that matters. The contents of this book are a roadmap to optimal cognition, an algorithm for predictably arriving at true beliefs and efficiently achieving goals, independent of the specific psychology of the species.

You: I’ll be blunt, then. Nobody is going to read this book.

Visitor: What? Why?

You: First of all, the vast majority of humans would find this title condescending. Some would feel insulted by the implication that they are not rational enough. Some would agree that other people aren’t very rational and that other people should read a book with this title, but would themselves pass it by.

Visitor: Interesting. So the title needs to be less … condescending, in order to appeal to human psychology?

You: That’s only one problem. The real problem is the lack of positive appeal. It’s not enough to remove the repellent aspect of the title; you need to include some kind of subtle sales pitch, either in the title or possibly in a subtitle. The title must answer the question, “Why should I pick up this book?”

Visitor: But it’s obvious. It’s self-evident. You should want to be more rational.

You: It … does seem like it should be self-evident. I assure you, it’s not. The majority of humans do not even realize that the quality of their thinking is something that can be improved. They don’t even see their thinking as a thing that possesses quality, or that can be compared to a standard.

Visitor: How about, “How To Be Less Stupid”?

You: … I was thinking more along the lines of “How To Be Less Wrong”, although even that doesn’t make me reach for my wallet. Even better would be something like, “Twelve Proven Ways to Be Smarter, Happier, Sexier and Richer—Number Ten Will Shock You!”

Visitor: That title would appeal to humans?

You: It would stand a better chance of getting picked up by the modal human than “How To Be Less Stupid.” But humans are very diverse. Some of us hate to be told we’re wrong, while some of us seek out that experience. Now that I think of it, this would work best if there were several different titles aimed at different subsets of people. But before we spend too much time on the title, I have another criticism. As I flip through this manuscript, I realize that you seem to have written this book in such a way that someone like me might understand it but not enjoy it. Let’s leave aside the question of whether I represent a typical human intellect. Even if somebody picks this up off the shelf, they’re not going to read it. It’s not entertaining! It reads like a particularly dense textbook.

Visitor: But it is a non-fiction book meant to improve cognition. Why should you expect it to be entertaining?

You: One of the ways humans are irrational is that we don’t govern our attentional resources anything close to optimally. You could improve this book in a number of ways. You could break the contents down into some kind of … hm … collection of Sequences of bite-sized conceptual nuggets, and write each of those nuggets with an eye toward providing a clear lesson. It wouldn’t hurt to use an engaging writing style. I think more people would actually make it through the book if you wrote it this way.

Visitor: So you propose changing the format of the book into a set of Sequences, containing the same content but configured in a more appealing way?

You: That would help, but it still won’t be enough. Somebody like me might read the book you’ve just described, but I still don’t think it would be widely popular. It wouldn’t take root in the public consciousness. It wouldn’t transform society in the way you’ve implied you expect it to.

Visitor: Clearly my understanding of your psychology is deeply lacking, but I can’t help but think of the impact that many of your culture’s fiction books have had on public consciousness.

You: Yes! Telling a really good story, with vibrant characters who live out and demonstrate the lessons contained in the book, would reach so many more people. Something that didn’t just describe, but demonstrated these … Methods of Rationality. But—there’s no such thing as a universally appealing story. No matter how good your story is, some people are going to find it off-putting for unpredictable reasons. Humans will spend hundreds of hours dragging a book they haven’t even read based on an out-of-context one-line quote. You can’t possibly anticipate the divisiveness that can be provoked by fiction. I daresay that while such a work of fiction would probably reach far more people than a work of nonfiction with equivalent content, it would also alienate a much larger number of people, who might come to define themselves as being against rationality for the stupidest possible reasons.

Visitor: Is that … really? This is a thing that happens?

You: Humans are a social and fundamentally tribal species. In periods of high material wealth, we invent tribal categories to divide ourselves into. These categories come to feel ontologically real. We are more than capable of forming tribes around fictional works.

Visitor: That would pose a problem. So, what you’re saying is that it would be necessary to write a number of such stories, each sufficiently different in tone, genre, and style that at least one of them would be virtually guaranteed to appeal to any given individual?

You: Yeah. That might do it. Maybe. But I have other reservations about your scheme. It seems like you’re imagining that these ideas will penetrate the public consciousness and then actually be transformative on both an individual and a societal level. A much more likely outcome is that some minority takes the ideas seriously, but most treat them as an intellectual fad and forget 99% of them within a year. Lacking any kind of social accountability structure, even the minority who take to the ideas will have tremendous difficulty truly internalizing them.

Visitor: So you propose some kind of social reinforcement structure. The creation of some kind of tribe built around these ideas. Some kind of … Rationality Community.

You: Yes, I suppose. But … Hm.

Visitor: What?

You: Well. Based on my knowledge of humans, the kinds of people who would be particularly susceptible to rationality content would also have memetic immune systems that would make forming an actual, functioning rationality community very difficult.

Visitor: For example?

You: Oof. Well, for one thing, people would automatically pattern-match pretty much any attempt at forming an organizational structure to a “religion” or a “cult” even though what’s actually being attempted is the literal opposite of those things. When it comes to actual formal documents specifying the objectives and structure of the organization, people would get endlessly caught up in relatively inconsequential choices of language or focus, perpetually bickering over the last 1% of linguistic distinction that separates their aims. You would think people who prize rationality would be able to shield themselves from the narcissism of small differences, but I suspect not, in reality. God forbid anyone try any kind of bold sociological experiment—anything that looks “weird” is going to get crucified. And yes, I appreciate the irony of the word “crucified” in this context.

You, cont’d: And some people would always rather compete than join. You can’t really create a movement for “rationalism” without creating, alchemically, a group of “post-rationalists” or something, who won’t join the club, even if they would actually fit in perfectly with the club, and probably enjoy it, to boot. And then there’s the group of people who just like to sneer at the thing other people are sneering at. If people can be cynical and snide about Fred Rogers, they can be dismissive of the project of improving human rationality.

Visitor: Okay, but this seems solvable. Right now you’re speaking about the way humans behave by default, but we’ve already solved a lot of these issues. If part of optimal cognition, reliable truthseeking strategies, and effective goal pursuit—i.e., rationality—involves adopting new and better norms for how to think about and build good, functional groups and organizations, then anyone who is actually serious about rationality should be gung ho about adopting those new norms. And we can totally help with that. It’s in the book, page 2,433. It strikes me that your world just needs a minimally viable seed, a rationality organization that has the right structure and norms that actually permit it to grow. And then it will grow, because the game-theoretic conditions for growth are met.

You: What kind of norms?

Visitor: For example, norms that encourage and promote a kind of organized, well-designed, and effective activism. Per your own description of human psychology, highly effective activism does not tend to arise naturally in congregations of people who are overly concerned about rebuffing accusations that they’re part of a cult.

You: I admit I don’t know what that would actually look like. I haven’t read your book. And the thought of it makes me anxious. I’m automatically suspicious of any organization that wants to grow. Even I am not above making the comparison to religion, here.

Visitor: Can you not reflect on how your automatic—and therefore, probably, not rational—suspicion is ultimately self-defeating? And probably not even meritorious, since you literally don’t know what the book says this organization would look like? Your world is full to bursting with powerful, hierarchical organizations with much flimsier justifications for existence than “improving the quality of thinking and therefore the epistemic accuracy and instrumental effectiveness of the species.” It’s almost … cowardly of you to insist that you can’t possibly try to actually promote the one thing you care most about in the world, which you honestly believe could help save your world, while all around you thrive countless powerful political blocs promoting intellectual snake oil.

You: I’m starting to suspect that you’re actually trying to infect my civilization with some kind of viral meme.

Visitor: Gah. You’re performing that same kind of knee-jerk pattern matching you just complained about. So what if it is a virus, if it’s a virus that benefits you, and which you consent to being infected with?

You: I understand what you mean, but you probably don’t want to use that exact rhetoric going forward.

Visitor: Okay. I think it’s about time to wrap up here. In case you’ve forgotten who you’re talking to, I represent an unfathomably advanced galactic polity, and we aren’t stupid. We anticipated everything that has occurred in this conversation, and the purpose of this chat was to get you mentally to the point where you would be susceptible to the following argument. Obviously I understand that admitting to this kind of manipulation diminishes its effectiveness, but again, we’re committed to a high ethical standard, and our philosopher corps tells me I can’t just gloss over the fact that you’ve been suckered into this crux.

You: … in retrospect, that makes sense.

Visitor: So here is the argument: broadly, you have two choices. You can decide to help be part of an actual rationality organization. We won’t tell you how to do it. We’re not actually going to give you the book. I’m sorry, that was part of the trick. In order to ethically uplift your race, we need you to figure it out for yourselves. Only you can compensate for the quirks and idiosyncrasies of your own species. We can’t do it for you.

Visitor, cont’d: If you make the decision to set aside your automatic hesitations, your impulse to pattern-match what I’m suggesting to other things, the chorus of arguments rising in your mind describing how it’s impossible—only then do you have a chance of success. Only then do you have a chance of uplifting your race to something happier and stronger and better.

Visitor, cont’d: And if you aren’t capable of making that choice, of committing to actually try, and of allowing your deep conflict over the endeavor to make you productively paranoid and engender the necessary level of constant vigilance, then you get the bad ending. Which is to say, you get more of the same. Rationality doesn’t become something that the world cares about unless the people who do care about it care enough to actually convince the world that it should. You yourself just told me, in detail, how and why a rationality community lacking an organized activist component fails to flourish as it might, as it should.

Visitor, cont’d: Of course, I could be wrong. After all, this conversation probably has much more to do with the psilocybin you ate a while ago than it does with any real galactic intervention, and my message here probably has a lot more to do with what you suspect, but feel conflicted about, than with any semi-divine imperative.

Visitor, cont’d: In either case, the choice is the same: Do you have the courage to be a joiner in a tribe of iconoclasts?