31 Laws of Fun

So this is Utopia, is it? Well
I beg your pardon, I thought it was Hell.
-- Sir Max Beerbohm, verse entitled
In a Copy of More’s (or Shaw’s or Wells’s or Plato’s or Anybody’s) Utopia

This is a shorter summary of the Fun Theory Sequence with all the background theory left out—just the compressed advice to the would-be author or futurist who wishes to imagine a world where people might actually want to live:

  1. Think of a typical day in the life of someone who’s been adapting to Utopia for a while. Don’t anchor on the first moment of “hearing the good news”. Heaven’s “You’ll never have to work again, and the streets are paved with gold!” sounds like good news to a tired and poverty-stricken peasant, but two months later it might not be so much fun. (Prolegomena to a Theory of Fun.)

  2. Beware of packing your Utopia with things you think people should do that aren’t actually fun. Again, consider Christian Heaven: singing hymns doesn’t sound like loads of endless fun, but you’re supposed to enjoy praying, so no one can point this out. (Prolegomena to a Theory of Fun.)

  3. Making a video game easier doesn’t always improve it. The same holds true of a life. Think in terms of clearing out low-quality drudgery to make way for high-quality challenge, rather than eliminating work. (High Challenge.)

  4. Life should contain novelty—experiences you haven’t encountered before, preferably teaching you something you didn’t already know. If there isn’t a sufficient supply of novelty (relative to the speed at which you generalize), you’ll get bored. (Complex Novelty.)

  5. People should get smarter at a rate sufficient to integrate their old experiences, but not so much smarter so fast that they can’t integrate their new intelligence. Being smarter means you get bored faster, but you can also tackle new challenges you couldn’t understand before. (Complex Novelty.)

  6. People should live in a world that fully engages their senses, their bodies, and their brains. This means either that the world resembles the ancestral savanna more than, say, a windowless office; or alternatively, that brains and bodies have changed to be fully engaged by different kinds of complicated challenges and environments. (Fictions intended to entertain a human audience should concentrate primarily on the former option.) (Sensual Experience.)

  7. Timothy Ferriss: “What is the opposite of happiness? Sadness? No. Just as love and hate are two sides of the same coin, so are happiness and sadness… The opposite of love is indifference, and the opposite of happiness is—here’s the clincher—boredom… The question you should be asking isn’t ‘What do I want?’ or ‘What are my goals?’ but ‘What would excite me?’… Living like a millionaire requires doing interesting things and not just owning enviable things.” (Existential Angst Factory.)

  8. Any particular individual’s life should get better and better over time. (Continuous Improvement.)

  9. You should not know exactly what improvements the future holds, although you should look forward to finding out. The actual event should come as a pleasant surprise. (Justified Expectation of Pleasant Surprises.)

  10. Our hunter-gatherer ancestors strung their own bows, wove their own baskets, and whittled their own flutes; then they did their own hunting, their own gathering, and played their own music. Futuristic Utopias are often depicted as offering more and more neat buttons that do less and less comprehensible things for you. Ask not what interesting things Utopia can do for people; ask rather what interesting things the inhabitants could do for themselves—with their own brains, their own bodies, or tools they understand how to build. (Living By Your Own Strength.)

  11. Living in Eutopia should make people stronger, not weaker, over time. The inhabitants should appear more formidable than the people of our own world, not less. (Living By Your Own Strength; see also Tsuyoku Naritai.)

  12. Life should not be broken up into a series of disconnected episodes with no long-term consequences. No matter how sensual or complex, playing one really great video game after another does not make a life story. (Emotional Involvement.)

  13. People should make their own destinies; their lives should not be choreographed to the point that they no longer need to imagine, plan, and navigate their own futures. Citizens should not be the pawns of more powerful gods, still less their sculpted material. One simple solution would be to have the world work by stable rules that are the same for everyone, where the burden of Eutopia is carried by a good initial choice of rules, rather than by any optimization pressure applied to individual lives. (Free to Optimize.)

  14. Human minds should not have to play on a level field with vastly superior entities. Most people don’t like being overshadowed. Gods destroy a human protagonist’s “main character” status; this is undesirable in fiction and probably in real life. (E.g.: C. S. Lewis’s Narnia, Iain Banks’s Culture.) Either change people’s emotional makeup so that they don’t mind being unnecessary, or keep the gods way off their playing field. Fictional stories intended for human audiences cannot do the former. (And in real life, you probably can have powerful AIs that are neither sentient nor meddlesome. See the main post and its prerequisites.) (Amputation of Destiny.)

  15. Trying to compete on a single flat playing field with six billion other humans also creates problems. Our ancestors lived in bands of around 50 people. Today the media is constantly bombarding us with news of exceptionally rich and pretty people as if they lived next door to us; and very few people get a chance to be the best at any specialty. (Dunbar’s Function.)

  16. Our ancestors also had some degree of genuine control over their band’s politics. Contrast to modern nation-states, where almost no one knows the President on a personal level or could argue Congress out of a bad decision. (Though that doesn’t stop people from arguing as loudly as if they still lived in a 50-person band.) (Dunbar’s Function.)

  17. Offering people more options is not always helping them (especially if the option is something they couldn’t do for themselves). Losses are more painful than the corresponding gains, so if choices are different along many dimensions and only one choice can be taken, people tend to focus on the loss of the road not taken. Offering a road that bypasses a challenge makes the challenge feel less real, even if the cheat is diligently refused. It is also a sad fact that humans predictably make certain kinds of mistakes. Don’t assume that building more choice into your Utopia is necessarily an improvement because “people can always just say no”. This sounds reassuring to an outside reader—“Don’t worry, you’ll decide! You trust yourself, right?”—but might not be much fun to actually live with. (Harmful Options.)

  18. Extreme example of the above: being constantly offered huge temptations that are incredibly dangerous—a completely realistic virtual world, or very addictive and pleasurable drugs. You can never allow yourself a single moment of willpower failure over your whole life. (E.g.: John C. Wright’s Golden Oecumene.) (Devil’s Offers.)

  19. Conversely, when people are grown strong enough to shoot off their feet without external help, stopping them may be too much interference. Hopefully they’ll then be smart enough not to: By the time they can build the gun, they’ll know what happens if they pull the trigger, and won’t need a smothering safety blanket. If that’s the theory, then dangerous options need correspondingly difficult locks. (Devil’s Offers.)

  20. Telling people truths they haven’t yet figured out for themselves is not always helping them. (Joy in Discovery.)

  21. Brains are some of the most complicated things in the world. Thus, other humans (other minds) are some of the most complicated things we deal with. For us, this interaction has a unique character because of the sympathy we feel for others—the way that our brain tends to align with their brain—rather than our brain just treating other brains as big complicated machines with levers to pull. Reducing the need for people to interact with other people reduces the complexity of human existence; this is a step in the wrong direction. For example, resist the temptation to simplify people’s lives by offering them artificially perfect sexual/romantic partners. (Interpersonal Entanglement.)

  22. But admittedly, humanity does have a statistical sex problem: the male distribution of attributes doesn’t harmonize with the female distribution of desires, or vice versa. Not everything in Eutopia should be easy—but it shouldn’t be pointlessly, unresolvably frustrating either. (This is a general principle.) So imagine nudging the distributions to make the problem solvable—rather than waving a magic wand and solving everything instantly. (Interpersonal Entanglement.)

  23. In general, tampering with brains, minds, emotions, and personalities is far more fraught, on every possible level of ethics and difficulty, than tampering with bodies and environments. Always ask what you can do by messing with the environment before you imagine messing with minds. Then prefer small cognitive changes to big ones. You’re not just outrunning your human audience, you’re outrunning your own imagination. (Changing Emotions.)

  24. In this present world, there is an imbalance between pleasure and pain. An unskilled torturer with simple tools can create worse pain in thirty seconds than an extremely skilled sexual artist can create pleasure in thirty minutes. One response would be to remedy the imbalance—to have the world contain more joy than sorrow. Pain might exist, but not pointless endless unendurable pain. Mistakes would have more proportionate penalties: You might touch a hot stove and end up with a painful blister; but not glance away for two seconds and spend the rest of your life in a wheelchair. The people would be stronger, less exhausted. This path would eliminate mind-destroying pain, and make pleasure more abundant. Another path would eliminate pain entirely. Whatever the relative merits of the real-world proposals, fictional stories cannot take the second path. (Serious Stories.)

  25. George Orwell once observed that Utopias are chiefly concerned with avoiding fuss. Don’t be afraid to write a loud Eutopia that might wake up the neighbors. (Eutopia is Scary; George Orwell’s Why Socialists Don’t Believe in Fun.)

  26. George Orwell observed that “The inhabitants of perfect universes seem to have no spontaneous gaiety and are usually somewhat repulsive into the bargain.” If you write a story and your characters turn out like this, it probably reflects some much deeper flaw that can’t be fixed by having the State hire a few clowns. (George Orwell’s Why Socialists Don’t Believe in Fun.)

  27. Ben Franklin, yanked into our own era, would be surprised and delighted by some aspects of his Future. Other aspects would horrify, disgust, and frighten him; and this is not because our world has gone wrong, but because it has improved relative to his time. Relatively few things would have gone just as Ben Franklin expected. If you imagine a world which your imagination finds familiar and comforting, it will inspire few others, and the whole exercise will lack integrity. Try to conceive of a genuinely better world in which you, yourself, would be shocked (at least at first) and out of place (at least at first). (Eutopia is Scary.)

  28. Utopia and Dystopia are two sides of the same coin; both just confirm the moral sensibilities you started with. Whether the world is a libertarian utopia of government non-interference, or a hellish dystopia of government intrusion and regulation, you get to say “I was right all along.” Don’t just imagine something that conforms to your existing ideals of government, relationships, politics, work, or daily life. Find the better world that zogs instead of zigging or zagging. (To safeguard your sensibilities, you can tell yourself it’s just an arguably better world but isn’t really better than your favorite standard Utopia… but you’ll know you’re really doing it right if you find your ideals changing.) (Building Weirdtopia.)

  29. If your Utopia still seems like an endless gloomy drudgery of existential angst no matter how much you try to brighten it, there’s at least one major problem that you’re entirely failing to focus on. (Existential Angst Factory.)

  30. ’Tis a sad mind that cares about nothing except itself. In the modern-day world, if an altruist looks around, their eye is caught by large groups of people in desperate jeopardy. People in a better world will not see this: A true Eutopia will run low on victims to be rescued. This doesn’t imply that the inhabitants look around outside themselves and see nothing. They may care about friends and family, truth and freedom, common projects; outside minds, shared goals, and high ideals. (Higher Purpose.)

  31. Still, a story that confronts the challenge of Eutopia should not just have the convenient plot of “The Dark Lord Sauron is about to invade and kill everybody”. The would-be author will have to find something slightly less awful for his characters to legitimately care about. This is part of the challenge of showing that human progress is not the end of human stories, and that people not in imminent danger of death can still lead interesting lives. Those of you interested in confronting lethal planetary-sized dangers should focus on present-day real life. (Higher Purpose.)

The simultaneous solution of all these design requirements is left as an exercise to the reader. At least for now.

The enumeration in this post of certain Laws shall not be construed to deny or disparage others not mentioned. I didn’t happen to write about humor, but it would be a sad world that held no laughter, etcetera.

To anyone seriously interested in trying to write a Eutopian story using these Laws: You must first know how to write. There are many, many books on how to write; you should read at least three; and they will all tell you that a great deal of practice is required. Your practice stories should not be set anywhere so difficult as Eutopia. That said, my second most important advice for authors is this: Life will never become boringly easy for your characters so long as they can make things difficult for each other.

Finally, this dire warning: Concretely imagining worlds much better than your present-day real life may suck out your soul like an emotional vacuum cleaner. (See Seduced by Imagination.) Fun Theory is dangerous; use it with caution. You have been warned.