The case for turning glowfic into Sequences

Epistemic status: serious, uncertain, moderate importance. Leaving comments is encouraged!

Recently, Eliezer Yudkowsky’s main writing output has been rationalist glowfic: role-play fiction written on an Internet forum like glowfic.com.[1] I think that LessWrongers, fans of rationalist fiction, and anyone interested in raising the sanity waterline should consider distilling lessons from Yudkowsky glowfic into LW posts.

Here’s the basic case:

  1. The original Sequences were extremely good at building the community and raising the sanity waterline. If you want to make the impact case, I think they plausibly account for multiple percent of the entire rationality community’s impact points.

  2. The Sequences are incomplete. Despite most of his knowledge coming from his home planet, Eliezer has in fact learned things since 2009. Having more sequences would be great!

  3. Eliezer’s thoughts are still relevant. Recent posts like conversations with AI researchers, calling attention to underrated ideas, and short fiction have all been good.

  4. Not everyone gets useful lessons from the Sequences, because Eliezer’s writing style and tone can be annoying. Eliezer was deliberately discourteous towards “stupid ideas”, and regrets this. Also, some people just learn better from other writing styles.

  5. Eliezer stopped writing Sequences and probably cannot write more. This is a combination of Eliezer’s chronic fatigue syndrome and being tired of trolls / bad takes in comments. The only medium he can write in without being drained is glowfic. Thus, even though it’s a non-serious format, glowfic is Eliezer’s main intellectual output right now.

  6. Eliezer attempts to make his glowfic roughly as edifying as HPMOR, and among people who read glowfic, some find it really good at teaching rationality.

  7. But not everyone can read glowfic and gain useful lessons.

    1. Many people (including me) read fiction for maximum enjoyment rather than to extract maximum knowledge. I had the same problem with HPMOR, reading through it like any other novel, whereas many people I know who got more from HPMOR read it carefully, perhaps stopping after every chapter to think about the goals and motivations of each character and predict what happens next.

    2. It’s really long (>>100 hours of reading time just for the existing material in the planecrash sequence) and most of the rationality lessons are contained in a small proportion of the words.

    3. It’s in a weird format; there’s no paper book or e-book version.

    4. Many of the stories have so much gratuitous sex (and often bad kink practices, torture, etc.) that they’re inappropriate for children and offputting to some adults. (I started reading HPMOR at 14 and would not recommend that most 14-year-olds read glowfic.)

I expect that if good work is produced here, it’s mostly by people who personally derived some important lesson from glowfic, and were thinking of writing it up already, whether or not it’s on the idea list below. One such person could potentially be counterfactually responsible for getting a lot more discussion of, and context for, Eliezer’s current thoughts into the community, which I would see as a big win.

Q&A

What is glowfic and how do I read it?

There’s a LW post explaining the format here, and also a community guide written by members of the glowfic community. Eliezer also announced the planecrash sequence in particular and linked to a website containing just planecrash.

Surely glowfic doesn’t actually contain useful information?

I’m pretty uncertain about the value of glowfic. I would update down if several people tried creating posts and none of them were good. But right now I think it’s underexplored. Some evidence on the value of glowfic:

  • + HPMOR spawned discussion of core rationalist virtues, like heroic responsibility

  • ? HPMOR didn’t have good sequences extracted from it (though maybe that’s because most of the rationality material was already in the original Sequences)

  • ? reaction to this idea from glowfic fans I know has been mixed: some are pretty enthusiastic, while some think glowfic doesn’t contain much practical rationality content

  • ? people have not written about many glowfic-derived insights yet (though maybe there’s no particular reason for this, which would mean the project is neglected rather than unworkable)

  • - This post was less well received than I expected (though maybe that’s due to concern about generalizing from fictional evidence, which wouldn’t be a problem with all glowfic-derived sequences)

How should I start writing?

I don’t necessarily recommend reading rationalist glowfic just to gain shards of Eliezer’s thinking and write them up, if you don’t find it fun in itself. (If you want to do this anyway, reading the first 2/3 of Mad Investor Chaos is a place to start.) But if you’re already a glowfic fan, here’s a list of topics from glowfic that could be turned into posts. (Thanks to Keller Scholl for some of these.) A large class of these is “dath ilani virtue”: positive traits displayed by the civilization in Eliezer’s utopia, or its citizens when placed in other worlds.

  • An introduction to rationalist glowfic: where glowfic lives, how to read it.

  • “Lawfulness” and its facets: Bayes, expected utility, the ability to coordinate and trade, etc.

  • How Keltham analyzes everything to try to understand it as an equilibrium between rational actors, whether this works in real life, and how to do it

  • The strengths and weaknesses of glowfic as an edification tool

  • “What would Otolmens say?”[2]

  • What civilizational competence looks like

  • A list of dath ilani virtues.

  • Decision theory. Some possible topics:

    • Someone who helps you should be rewarded, even if you were not in contact with them at the time

    • Rational actors don’t respond to threats

  • Applied rationality. Some possible topics:

    • Forming hypotheses is costly, because they distort future thinking in favor of themselves, and should be avoided as long as possible

    • Evidence accumulates: so long as you track hypotheses and evidence-shifts accurately, you will converge on the truth, and reality is full of information (see the toy sketch after this list)

    • How to “introspectively experience belief updates”
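
As a toy illustration of the “evidence accumulates” point (the coin, the hypotheses, and the numbers below are my own example, not drawn from the glowfic): Bayesian evidence-shifts simply add in log-odds space, so an observer who tracks them accurately converges on the truth even when no single observation is decisive.

```python
import math
import random

# Toy example: is the coin biased 60% heads (hypothesis H) or fair (the
# alternative)? We track the log-odds of H and add the log-likelihood-ratio
# contributed by each flip.

random.seed(0)
TRUE_P_HEADS = 0.6          # reality: the coin really is biased
P_H, P_ALT = 0.6, 0.5       # the two hypotheses being tracked

log_odds = 0.0              # start at even odds: log(1:1) = 0
for flip in range(1, 1001):
    heads = random.random() < TRUE_P_HEADS
    # Likelihood ratio of this observation under H vs. the alternative.
    lr = (P_H / P_ALT) if heads else ((1 - P_H) / (1 - P_ALT))
    log_odds += math.log(lr)   # evidence-shifts add in log space
    if flip % 200 == 0:
        posterior = 1 / (1 + math.exp(-log_odds))
        print(f"after {flip} flips: P(biased) = {posterior:.4f}")
```

Each flip shifts the log-odds by only a fraction of a bit, but the shifts accumulate, and after a few hundred flips the tracked posterior is typically close to certain.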

There are also points in glowfic where Eliezer essentially writes a blog post, either as the narrator or through a character giving a lecture; such content could be posted here with minor annotations/edits.

What not to write

If the goal is edification, I’m not particularly looking for the following artifacts (but I’d like to be proven wrong).

  • Plot summaries: I can’t see anything in the plots of the glowfic I’ve read so far that’s more useful than the plot of any other fiction. (I also don’t expect these to be very fun to read.)

  • Book reviews: The reviews I’ve seen so far are amusing but don’t really teach anything. Someone like Scott Alexander could write a book review that does teach things, but it doesn’t seem substantially easier than writing other glowfic-related content. (edit: since writing this I’m more excited about book reviews than I was, although they do have to be done well)

  • Broad high-context discussions: HPMOR discussions were successful, but aren’t what I’m looking for; ideally we make glowfic content accessible for people who don’t want to read glowfic.

If Eliezer can’t write nonfiction because of trolls and bad takes, won’t turning glowfic into Sequences just make him stop writing glowfic?

No, I asked him.

Seems plausibly good, but this is a dumb plan. Are there better plans?

Maybe! Here are some alternate plans:

  • get Eliezer to write enlightening short fiction rather than glowfic

  • get Eliezer to write glowfic excerpts that can be posted on LW

  • create glowfic characters for top AI researchers, and have Eliezer critique their ideas by role-playing with them (mostly a joke)

Some plans sound much less dumb but maybe intractable:

  • cure Eliezer’s chronic fatigue so he can actually attempt to save the world (or at least grant humanity a couple more bits of information-theoretic dignity)

    • There was a $100,000 bounty for this that went unclaimed. Also, 5 people worked pretty seriously on it part-time for 2 years before giving up.

  • have Eliezer do more consulting with AI alignment researchers instead

    • This is already happening. I have heard that this is much more tiring for Eliezer than writing glowfic, and that the glowfic is basically free, being written in his free time and not requiring nearly as much energy as consulting.

  1. ^

    Note that not all glowfic is rationalist fiction, and not all rationalist fiction is written as glowfic.

  2. ^

    In the planecrash series, Otolmens is the god of preventing existential risk.