As a final note: the term “Butlerian Jihad” is taken from Dune and describes the shunning of “thinking machines” by mankind.
In Dune, “thinking machines” are shunned because of a very longstanding taboo that was pretty clearly established in part by a huge, very bloody war. The intent was to make that taboo permanent, not a “pause”, and it more or less succeeded in that.
It’s a horrible metaphor and I strongly suggest people stop using it.
the Culture ending: aligned, good ASI (via CEV or similar) is created and brings us to some hypothetical utopia. Humanity enjoys a rich life in some manner compatible with your personal morals.
the "Santa Claus to 11" ending: ASI solves our problems and human development stagnates; the ASI goes on to do its own thing without killing humans, but also without human influence on the lightcone.
Um, humans in the Culture have no significant influence on the lightcone (other than maybe as non-agentic “butterfly wings”). The Minds decide what’s going to happen. Humans opposed to that will be convinced (often via manipulation subtle enough that they don’t even know about it) or ignored. Banks struggled to even find reasons to write stories about the humans, and sometimes had to cheat to do so.
I have come to accept that some people have an attachment to the whole "human influence" thing, but how can you believe that and simultaneously say the Culture is a good outcome?
The intent was to make that taboo permanent, not a “pause”, and it more or less succeeded in that.
I would not be opposed to a society stalled at 2016-level AI/computing that held that level indefinitely. Progress can certainly continue without AGI via, e.g., human intelligence enhancement, or just by sending our best and brightest to work directly on our problems instead of on zero-sum marketing or AI efforts.
Um, humans in the Culture have no significant influence on the lightcone (other than maybe as non-agentic “butterfly wings”). The Minds decide what’s going to happen
Humans were still free to leave the Culture, however; not all of the lightcone was given to the AI. Were we to develop aligned ASI, it would be wise to slice off a chunk of the lightcone for humans to work on “on their own.”
I don’t think the Culture is an ideal outcome, either, merely a “good” one that many people would be familiar with. “Uplifting” humans rather than developing replacements for them will likely lead us down a better path, although the moral alignment shift in whatever the uplifting process is might limit its utility.