Hi. I’m Charlie Stross and I wrote *Accelerando*. It was originally a series of novelettes (short stories) written from 1998 through 2003 and published in Asimov’s SF Magazine (they racked up five Hugo nominations along the way) before being assembled into a fix-up novel. So it’s earlier than you think … and a lot less optimistic.
I think you missed a key point: the narrator is Aineko, and Aineko is not a cat. Aineko is an sAI that has figured out that humans are more easily interacted with/manipulated if you look like a toy or a pet than if you look like a Dalek. Aineko is not benevolent: and the human “survivors” in the final chapter aren’t even themselves, they’re simulations Aineko is running for its own reasons.
Human-style sentience is not really capable of surviving the future posited by *Accelerando*. The Wunch, and then later the Vile Offspring, inherently out-compete us and then render us mostly extinct. By chapter 8 most of humanity is dead: all that’s left, exiled to the far corners of the solar system, are refugees (and a constant assault of AI slop-generated copies of historic personalities that constitutes a DoS attack on humanity).
FWIW, my thinking today is that the whole singularitarian/TESCREAL nexus identified by the DAIR Institute folks is basically an attempt by self-avowed rationalists to re-create the Christian design patterns underlying their early socialization without actually taking on the god/jesus bullshit that the likes of Peter Thiel are at home with. Seriously: it’s all just a re-implementation of Christianity. (And this raised-Jewish guy wants none of it.)
>I see where you’re coming from with the “recreate Christianity” thing. I’m curious what you think it’d look like, to be, like, actually trying to model what the future might look like and prepare for it in some kind of sensible way, that didn’t feel that way?
Well, I’m something of a skeptic. (And what we’re seeing today is definitely not actual intelligence-in-a-box, it’s just a hype bubble being inflated by the usual Silicon Valley grifters to keep the dollars flowing in from the credulous: I’m pretty certain it’s going to burst in the next few months.)
I’m currently working on a far-future/space opera novel which asks, basically, what if there is no singularity, no mind uploading, no simulation afterlife, and no real route to sAIs (at least, routes accessible to human-grade intelligences), but (a) we get a mechanism for FTL expansion (this is a necessary hand-wave, or I don’t have a space opera, I have a bucket-of-crabs trapped on a single planet), and (b) TESCREAL turns out to be a design pattern for successful evangelical religions among technological civilizations? (There are holy wars. Boy are there holy wars!)
It’s a little overdue (I began it in 2015, then real life got in the way, repeatedly) but hopefully it’ll be published in the next 2-3 years: the wheels of trade fiction publishing grind slow.